Source:
Source: Dr.P.Soundarapandian.M.D.,D.M (Senior Consultant Nephrologist), Apollo Hospitals, Managiri, Madurai Main Road, Karaikudi, Tamilnadu, India.
Creator: L.Jerlin Rubini(Research Scholar) Alagappa University, EmailId :jel.jerlin '@' gmail.com ContactNo :+91-9597231281
Guided by: Dr.P.Eswaran Assistant Professor, Department of Computer Science and Engineering, Alagappa University, Karaikudi, Tamilnadu, India. Emailid:eswaranperumal '@' gmail.com
Load the Data
Overview of the Data
Data Preparation
Exploratory Data Analysis
Model Building
Improve Model
The notebook is designed in such a way that you just need to plug in the input values given below and run the code. It will run on it's own and will build the model as well.
import warnings
from warnings import filterwarnings
filterwarnings('ignore')
# Input file name with path
input_file_name = 'kidney_disease.csv'
# Target class name
input_target_class = "class"
# Columns to be removed
input_drop_col = "id"
# Col datatype selection
input_datatype_selection = 'auto' # use auto if you don't want to provide column names by data type else use 'manual'
# Categorical columns
input_cat_columns = [
'red blood cells', 'pus cell', 'pus cell clumps', 'bacteria',
'packed cell volume',
'white blood cell count', 'red blood cell count', 'ypertension',
'diabetes mellitus', 'coronary artery disease', 'appetite',
'pedal edema', 'anemia', 'class']
# Numerical columns
input_num_columns = ['id', 'age', 'blood pressure', 'specific gravity', 'albumin', 'sugar','blood glucose random', 'blood urea', 'serum creatinine', 'sodium',
'potassium', 'haemoglobin']
# Encoding technique
input_encoding = 'LabelEncoder' # choose the encoding technique from 'LabelEncoder', 'OneHotEncoder', 'OrdinalEncoder' and 'FrequencyEncoder'
# Handle missing value
input_treat_missing_value = 'impute' # choose how to handle missing values from 'drop','inpute' and 'ignore'
In this section you will:
Import all the libraries in the first cell itself
from pyforest import *
# Import libraries
# Data Manipulation
import numpy as np
import pandas as pd
from pandas import DataFrame
# Data Visualization
import seaborn as sns
import matplotlib.pyplot as plt
# Machine Learning
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import LabelEncoder, StandardScaler, OrdinalEncoder
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.metrics import confusion_matrix , classification_report, accuracy_score, roc_auc_score, plot_roc_curve
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from imblearn.over_sampling import RandomOverSampler
import pickle
from sklearn.feature_selection import SelectKBest#Also known as Information Gain
from sklearn.feature_selection import chi2
from sklearn.model_selection import RandomizedSearchCV
# Maths
import math
# Set the options
pd.set_option('display.max_rows', 800)
pd.set_option('display.max_columns', 500)
%matplotlib inline
Load the dataset using pd.read_csv()
# Read data in form of a csv file
df = pd.read_csv(input_file_name)
# First 5 rows of the dataset
df.head()
| id | age | bp | sg | al | su | rbc | pc | pcc | ba | bgr | bu | sc | sod | pot | hemo | pcv | wc | rc | htn | dm | cad | appet | pe | ane | classification | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44 | 7800 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38 | 6000 | NaN | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31 | 7500 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32 | 6700 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35 | 7300 | 4.6 | no | no | no | good | no | no | ckd |
Before attempting to solve the problem, it's very important to have a good understanding of data.
In this section you will:
As the name says descriptive statistics describes the data. It gives you information about
Let's understand the data we have
# Dimension of the data
df.shape
(400, 26)
# Summary of the dataset
df.describe().T
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| id | 400.0 | 199.500000 | 115.614301 | 0.000 | 99.75 | 199.50 | 299.25 | 399.000 |
| age | 391.0 | 51.483376 | 17.169714 | 2.000 | 42.00 | 55.00 | 64.50 | 90.000 |
| bp | 388.0 | 76.469072 | 13.683637 | 50.000 | 70.00 | 80.00 | 80.00 | 180.000 |
| sg | 353.0 | 1.017408 | 0.005717 | 1.005 | 1.01 | 1.02 | 1.02 | 1.025 |
| al | 354.0 | 1.016949 | 1.352679 | 0.000 | 0.00 | 0.00 | 2.00 | 5.000 |
| su | 351.0 | 0.450142 | 1.099191 | 0.000 | 0.00 | 0.00 | 0.00 | 5.000 |
| bgr | 356.0 | 148.036517 | 79.281714 | 22.000 | 99.00 | 121.00 | 163.00 | 490.000 |
| bu | 381.0 | 57.425722 | 50.503006 | 1.500 | 27.00 | 42.00 | 66.00 | 391.000 |
| sc | 383.0 | 3.072454 | 5.741126 | 0.400 | 0.90 | 1.30 | 2.80 | 76.000 |
| sod | 313.0 | 137.528754 | 10.408752 | 4.500 | 135.00 | 138.00 | 142.00 | 163.000 |
| pot | 312.0 | 4.627244 | 3.193904 | 2.500 | 3.80 | 4.40 | 4.90 | 47.000 |
| hemo | 348.0 | 12.526437 | 2.912587 | 3.100 | 10.30 | 12.65 | 15.00 | 17.800 |
columns=pd.read_csv('data_description.txt',sep='-')
columns=columns.reset_index()
columns.columns=['cols','abb_col_names']
columns
| cols | abb_col_names | |
|---|---|---|
| 0 | id | id |
| 1 | age | age |
| 2 | bp | blood pressure |
| 3 | sg | specific gravity |
| 4 | al | albumin |
| 5 | su | sugar |
| 6 | rbc | red blood cells |
| 7 | pc | pus cell |
| 8 | pcc | pus cell clumps |
| 9 | ba | bacteria |
| 10 | bgr | blood glucose random |
| 11 | bu | blood urea |
| 12 | sc | serum creatinine |
| 13 | sod | sodium |
| 14 | pot | potassium |
| 15 | hemo | haemoglobin |
| 16 | pcv | packed cell volume |
| 17 | wc | white blood cell count |
| 18 | rc | red blood cell count |
| 19 | htn | ypertension |
| 20 | dm | diabetes mellitus |
| 21 | cad | coronary artery disease |
| 22 | appet | appetite |
| 23 | pe | pedal edema |
| 24 | ane | anemia |
| 25 | classification | class |
df.columns=columns['abb_col_names'].values
df.head()
| id | age | blood pressure | specific gravity | albumin | sugar | red blood cells | pus cell | pus cell clumps | bacteria | blood glucose random | blood urea | serum creatinine | sodium | potassium | haemoglobin | packed cell volume | white blood cell count | red blood cell count | ypertension | diabetes mellitus | coronary artery disease | appetite | pedal edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | NaN | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | NaN | NaN | 15.4 | 44 | 7800 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | NaN | normal | notpresent | notpresent | NaN | 18.0 | 0.8 | NaN | NaN | 11.3 | 38 | 6000 | NaN | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | NaN | NaN | 9.6 | 31 | 7500 | NaN | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32 | 6700 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | NaN | NaN | 11.6 | 35 | 7300 | 4.6 | no | no | no | good | no | no | ckd |
df.dtypes
id int64 age float64 blood pressure float64 specific gravity float64 albumin float64 sugar float64 red blood cells object pus cell object pus cell clumps object bacteria object blood glucose random float64 blood urea float64 serum creatinine float64 sodium float64 potassium float64 haemoglobin float64 packed cell volume object white blood cell count object red blood cell count object ypertension object diabetes mellitus object coronary artery disease object appetite object pedal edema object anemia object class object dtype: object
df.columns
Index(['id', 'age', 'blood pressure', 'specific gravity', 'albumin', 'sugar',
'red blood cells', 'pus cell', 'pus cell clumps', 'bacteria',
'blood glucose random', 'blood urea', 'serum creatinine', 'sodium',
'potassium', 'haemoglobin', 'packed cell volume',
'white blood cell count', 'red blood cell count', 'ypertension',
'diabetes mellitus', 'coronary artery disease', 'appetite',
'pedal edema', 'anemia', 'class'],
dtype='object')
df.columns = df.columns.str.replace(' ','_')
features=['red_blood_cell_count','packed_cell_volume','white_blood_cell_count']
def convert_dtype(df,feature):
df[feature] = pd.to_numeric(df[feature], errors='coerce')
for feature in features:
convert_dtype(df,feature)
df.dtypes
id int64 age float64 blood_pressure float64 specific_gravity float64 albumin float64 sugar float64 red_blood_cells object pus_cell object pus_cell_clumps object bacteria object blood_glucose_random float64 blood_urea float64 serum_creatinine float64 sodium float64 potassium float64 haemoglobin float64 packed_cell_volume float64 white_blood_cell_count float64 red_blood_cell_count float64 ypertension object diabetes_mellitus object coronary_artery_disease object appetite object pedal_edema object anemia object class object dtype: object
Get the info about missing values in the dataframe
# Missing values for every column
df.isna().sum()
id 0 age 9 blood_pressure 12 specific_gravity 47 albumin 46 sugar 49 red_blood_cells 152 pus_cell 65 pus_cell_clumps 4 bacteria 4 blood_glucose_random 44 blood_urea 19 serum_creatinine 17 sodium 87 potassium 88 haemoglobin 52 packed_cell_volume 71 white_blood_cell_count 106 red_blood_cell_count 131 ypertension 2 diabetes_mellitus 2 coronary_artery_disease 2 appetite 1 pedal_edema 1 anemia 1 class 0 dtype: int64
The data is not yet ready for model building. You need to process the data and make it ready for model building
In this section you will:
Machine Learning works on the idea of garbage in - garbage out. If you feed in dirty data, the results won't be good. Hence it's very important to clean the data before training the model.
Sklearn algorithms need missing value imputation but XGBoost, LightGBM etc does not require missing value imputation
There are various ways to handle missing values. Some of the ways are:
Here you can decide how you want to handle the missing data
# Select how you wish to treat missing values according to the input provided
if input_treat_missing_value == 'drop':
# drop rows with missing values
df.dropna(inplace=True)
print(df.shape)
elif input_treat_missing_value == 'impute':
# Impute missing values
for col in numerical_columns:
df[col] = df[col].fillna(df[col].median())
for col in categorical_columns:
mode = df[col].mode()[0]
df[col] = df[col].fillna(mode)
elif input_treat_missing_value == 'ignore':
print("Ignore missing values")
df.head()
| id | age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | normal | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | 138.0 | 4.4 | 15.4 | 44.0 | 7800.0 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | normal | normal | notpresent | notpresent | 121.0 | 18.0 | 0.8 | 138.0 | 4.4 | 11.3 | 38.0 | 6000.0 | 4.8 | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | 138.0 | 4.4 | 9.6 | 31.0 | 7500.0 | 4.8 | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32.0 | 6700.0 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | 138.0 | 4.4 | 11.6 | 35.0 | 7300.0 | 4.6 | no | no | no | good | no | no | ckd |
Encoding is the process of converting data from one form to another. Most of the Machine learning algorithms can not handle categorical values unless we convert them to numerical values. Many algorithm’s performances vary based on how Categorical columns are encoded.
There are lot of ways in which you can encode the categorical variables. Some of those are:
### total unique categories in our categorical features to check if any dirtiness in data or not
for col in categorical_columns:
print('{} has {} categories'.format(col, df[col].unique()))
appetite has [0 1] categories anemia has [0 1] categories bacteria has [0 1] categories coronary_artery_disease has [0 1] categories pus_cell has [1 0] categories pedal_edema has [0 1] categories class has [0 1] categories ypertension has [1 0] categories diabetes_mellitus has [1 0] categories red_blood_cells has [1 0] categories pus_cell_clumps has [0 1] categories
#Replace incorrect values
df['diabetes_mellitus'].replace(to_replace = {'\tno':'no','\tyes':'yes',' yes':'yes'},inplace=True)
df['coronary_artery_disease'] = df['coronary_artery_disease'].replace(to_replace = '\tno', value='no')
df['class'] = df['class'].replace(to_replace = 'ckd\t', value = 'ckd')
df.head()
| id | age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | class | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | normal | normal | notpresent | notpresent | 121.0 | 36.0 | 1.2 | 138.0 | 4.4 | 15.4 | 44.0 | 7800.0 | 5.2 | yes | yes | no | good | no | no | ckd |
| 1 | 1 | 7.0 | 50.0 | 1.020 | 4.0 | 0.0 | normal | normal | notpresent | notpresent | 121.0 | 18.0 | 0.8 | 138.0 | 4.4 | 11.3 | 38.0 | 6000.0 | 4.8 | no | no | no | good | no | no | ckd |
| 2 | 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | normal | normal | notpresent | notpresent | 423.0 | 53.0 | 1.8 | 138.0 | 4.4 | 9.6 | 31.0 | 7500.0 | 4.8 | no | yes | no | poor | no | yes | ckd |
| 3 | 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | normal | abnormal | present | notpresent | 117.0 | 56.0 | 3.8 | 111.0 | 2.5 | 11.2 | 32.0 | 6700.0 | 3.9 | yes | no | no | poor | yes | yes | ckd |
| 4 | 4 | 51.0 | 80.0 | 1.010 | 2.0 | 0.0 | normal | normal | notpresent | notpresent | 106.0 | 26.0 | 1.4 | 138.0 | 4.4 | 11.6 | 35.0 | 7300.0 | 4.6 | no | no | no | good | no | no | ckd |
Exploratory data analysis is an approach to analyze or investigate data sets to find out patterns and see if any of the variables can be useful in predicting the y variables. Visual methods are often used to summarise the data. Primarily EDA is for seeing what the data can tell us beyond the formal modeling or hypothesis testing tasks.
In this section you will:
It's better to get the list of columns by data types in the start itself. You won't have to manually write the name of columns while performing certain operations. So always get the list of columns in the start itself.
# Remove extra columns
col_remove = input_drop_col
df = df.drop(col_remove, axis = 1)
--------------------------------------------------------------------------- KeyError Traceback (most recent call last) /var/folders/z3/kr507s5946sfbvnysj54w95w0000gn/T/ipykernel_20830/1226646560.py in <module> 1 # Remove extra columns 2 col_remove = input_drop_col ----> 3 df = df.drop(col_remove, axis = 1) ~/miniforge3/envs/env_tensorflow/lib/python3.9/site-packages/pandas/util/_decorators.py in wrapper(*args, **kwargs) 309 stacklevel=stacklevel, 310 ) --> 311 return func(*args, **kwargs) 312 313 return wrapper ~/miniforge3/envs/env_tensorflow/lib/python3.9/site-packages/pandas/core/frame.py in drop(self, labels, axis, index, columns, level, inplace, errors) 4904 weight 1.0 0.8 4905 """ -> 4906 return super().drop( 4907 labels=labels, 4908 axis=axis, ~/miniforge3/envs/env_tensorflow/lib/python3.9/site-packages/pandas/core/generic.py in drop(self, labels, axis, index, columns, level, inplace, errors) 4148 for axis, labels in axes.items(): 4149 if labels is not None: -> 4150 obj = obj._drop_axis(labels, axis, level=level, errors=errors) 4151 4152 if inplace: ~/miniforge3/envs/env_tensorflow/lib/python3.9/site-packages/pandas/core/generic.py in _drop_axis(self, labels, axis, level, errors) 4183 new_axis = axis.drop(labels, level=level, errors=errors) 4184 else: -> 4185 new_axis = axis.drop(labels, errors=errors) 4186 result = self.reindex(**{axis_name: new_axis}) 4187 ~/miniforge3/envs/env_tensorflow/lib/python3.9/site-packages/pandas/core/indexes/base.py in drop(self, labels, errors) 6015 if mask.any(): 6016 if errors != "ignore": -> 6017 raise KeyError(f"{labels[mask]} not found in axis") 6018 indexer = indexer[~mask] 6019 return self.delete(indexer) KeyError: "['id'] not found in axis"
Note : There might be some mismatch in the data type of the columns, so in such cases you will have to correct it manually
You need to check the distribution of target class, see how many categories are there, is it balanced or not
# Check distribution of target class
sns.countplot(y=df[input_target_class] ,data=df)
plt.xlabel("Count of each Target class")
plt.ylabel("Target classes")
plt.show()
# Check the distribution of all the features
df.hist(figsize=(15,12),bins = 15)
plt.title("Features Distribution")
plt.show()
# box and whisker plots
df.plot(figsize=(15, 15), kind='box', subplots=True, layout=(8,8), sharex=False, sharey=False, fontsize=1)
plt.show()
# Heatmap of correlation between features
plt.figure(figsize = (20, 20))
sns.heatmap(df.drop(['class'], axis = 1).corr(), annot = True)
<AxesSubplot:>
Dive deeper on correlations, since some correlations could determine target feature(class) and demonstrated by visualization
Positive Correlation: specific_gravity -> red_blood_cell_count, packed_cell_volume, haemoglobin sugar -> blood_glucose_random blood_urea -> serum_creatinine haemoglobin -> red_blood_cell_count <- packed_cell_volume
Negative Correlation: Albumin, Blood urea -> Red blood cell count, packed cell volume, Haemoglobin Serum creatinine -> Sodium
# define kde_plot
def kde_plot(feature):
grid = sns.FacetGrid(df, hue="class",aspect=2)
grid.map(sns.kdeplot, feature)
grid.add_legend()
pos_features = df[['specific_gravity', 'red_blood_cell_count', 'packed_cell_volume', 'haemoglobin', 'sugar', 'blood_glucose_random',
'blood_urea', 'serum_creatinine']]
pos_features[pos_features.columns]
| specific_gravity | red_blood_cell_count | packed_cell_volume | haemoglobin | sugar | blood_glucose_random | blood_urea | serum_creatinine | |
|---|---|---|---|---|---|---|---|---|
| 0 | 1.020 | 5.2 | 44.0 | 15.40 | 0.0 | 121.0 | 36.0 | 1.20 |
| 1 | 1.020 | 4.8 | 38.0 | 11.30 | 0.0 | 121.0 | 18.0 | 0.80 |
| 2 | 1.010 | 4.8 | 31.0 | 9.60 | 3.0 | 423.0 | 53.0 | 1.80 |
| 3 | 1.005 | 3.9 | 32.0 | 11.20 | 0.0 | 117.0 | 56.0 | 3.80 |
| 4 | 1.010 | 4.6 | 35.0 | 11.60 | 0.0 | 106.0 | 26.0 | 1.40 |
| 5 | 1.015 | 4.4 | 39.0 | 12.20 | 0.0 | 74.0 | 25.0 | 1.10 |
| 6 | 1.010 | 4.8 | 36.0 | 12.40 | 0.0 | 100.0 | 54.0 | 24.00 |
| 7 | 1.015 | 5.0 | 44.0 | 12.40 | 4.0 | 410.0 | 31.0 | 1.10 |
| 8 | 1.015 | 4.0 | 33.0 | 10.80 | 0.0 | 138.0 | 60.0 | 1.90 |
| 9 | 1.020 | 3.7 | 29.0 | 9.50 | 0.0 | 70.0 | 107.0 | 7.20 |
| 10 | 1.010 | 4.8 | 28.0 | 9.40 | 4.0 | 490.0 | 55.0 | 4.00 |
| 11 | 1.010 | 3.8 | 32.0 | 10.80 | 0.0 | 380.0 | 60.0 | 2.70 |
| 12 | 1.015 | 3.4 | 28.0 | 9.70 | 1.0 | 208.0 | 72.0 | 2.10 |
| 13 | 1.020 | 4.8 | 40.0 | 9.80 | 0.0 | 98.0 | 86.0 | 4.60 |
| 14 | 1.010 | 2.6 | 16.0 | 5.60 | 2.0 | 157.0 | 90.0 | 4.10 |
| 15 | 1.015 | 2.8 | 24.0 | 7.60 | 0.0 | 76.0 | 162.0 | 9.60 |
| 16 | 1.015 | 4.8 | 40.0 | 12.60 | 0.0 | 99.0 | 46.0 | 2.20 |
| 17 | 1.020 | 4.8 | 40.0 | 12.10 | 0.0 | 114.0 | 87.0 | 5.20 |
| 18 | 1.025 | 4.3 | 37.0 | 12.70 | 3.0 | 263.0 | 27.0 | 1.30 |
| 19 | 1.015 | 3.7 | 30.0 | 10.30 | 0.0 | 100.0 | 31.0 | 1.60 |
| 20 | 1.015 | 3.2 | 24.0 | 7.70 | 0.0 | 173.0 | 148.0 | 3.90 |
| 21 | 1.020 | 3.6 | 32.0 | 10.90 | 0.0 | 121.0 | 180.0 | 76.00 |
| 22 | 1.025 | 3.4 | 32.0 | 9.80 | 0.0 | 95.0 | 163.0 | 7.70 |
| 23 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 42.0 | 1.30 |
| 24 | 1.015 | 4.6 | 39.0 | 11.10 | 0.0 | 121.0 | 50.0 | 1.40 |
| 25 | 1.025 | 3.7 | 29.0 | 9.90 | 0.0 | 108.0 | 75.0 | 1.90 |
| 26 | 1.015 | 4.0 | 35.0 | 11.60 | 0.0 | 156.0 | 45.0 | 2.40 |
| 27 | 1.010 | 4.1 | 37.0 | 12.50 | 4.0 | 264.0 | 87.0 | 2.70 |
| 28 | 1.020 | 4.8 | 40.0 | 12.65 | 3.0 | 123.0 | 31.0 | 1.40 |
| 29 | 1.005 | 4.8 | 38.0 | 12.90 | 0.0 | 121.0 | 28.0 | 1.40 |
| 30 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 93.0 | 155.0 | 7.30 |
| 31 | 1.015 | 4.0 | 30.0 | 10.10 | 0.0 | 107.0 | 33.0 | 1.50 |
| 32 | 1.010 | 4.0 | 34.0 | 11.30 | 1.0 | 159.0 | 39.0 | 1.50 |
| 33 | 1.020 | 4.8 | 29.0 | 10.10 | 0.0 | 140.0 | 55.0 | 2.50 |
| 34 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 171.0 | 153.0 | 5.20 |
| 35 | 1.020 | 4.9 | 36.0 | 12.00 | 1.0 | 270.0 | 39.0 | 2.00 |
| 36 | 1.015 | 4.8 | 32.0 | 10.30 | 0.0 | 92.0 | 29.0 | 1.80 |
| 37 | 1.020 | 2.5 | 28.0 | 9.70 | 0.0 | 137.0 | 65.0 | 3.40 |
| 38 | 1.020 | 4.8 | 40.0 | 12.50 | 0.0 | 121.0 | 103.0 | 4.10 |
| 39 | 1.010 | 4.2 | 40.0 | 13.00 | 2.0 | 140.0 | 70.0 | 3.40 |
| 40 | 1.010 | 4.1 | 32.0 | 11.10 | 0.0 | 99.0 | 80.0 | 2.10 |
| 41 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 20.0 | 0.70 |
| 42 | 1.010 | 4.5 | 33.0 | 9.70 | 0.0 | 204.0 | 29.0 | 1.00 |
| 43 | 1.010 | 3.1 | 24.0 | 7.90 | 0.0 | 79.0 | 202.0 | 10.80 |
| 44 | 1.010 | 4.8 | 28.0 | 9.70 | 0.0 | 207.0 | 77.0 | 6.30 |
| 45 | 1.020 | 4.8 | 40.0 | 9.30 | 0.0 | 208.0 | 89.0 | 5.90 |
| 46 | 1.015 | 4.7 | 37.0 | 12.40 | 0.0 | 124.0 | 24.0 | 1.20 |
| 47 | 1.010 | 4.8 | 45.0 | 15.00 | 0.0 | 121.0 | 17.0 | 0.80 |
| 48 | 1.005 | 3.5 | 29.0 | 10.00 | 0.0 | 70.0 | 32.0 | 0.90 |
| 49 | 1.010 | 3.5 | 29.0 | 9.70 | 0.0 | 144.0 | 72.0 | 3.00 |
| 50 | 1.020 | 3.8 | 28.0 | 8.60 | 0.0 | 91.0 | 114.0 | 3.25 |
| 51 | 1.015 | 4.8 | 33.0 | 10.30 | 0.0 | 162.0 | 66.0 | 1.60 |
| 52 | 1.015 | 3.7 | 34.0 | 10.90 | 0.0 | 121.0 | 38.0 | 2.20 |
| 53 | 1.015 | 4.7 | 40.0 | 13.60 | 5.0 | 246.0 | 24.0 | 1.00 |
| 54 | 1.010 | 4.2 | 40.0 | 13.00 | 2.0 | 121.0 | 42.0 | 3.40 |
| 55 | 1.005 | 4.8 | 28.0 | 9.50 | 0.0 | 121.0 | 42.0 | 1.30 |
| 56 | 1.015 | 3.4 | 30.0 | 10.20 | 4.0 | 121.0 | 164.0 | 9.70 |
| 57 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 93.0 | 155.0 | 7.30 |
| 58 | 1.020 | 4.3 | 33.0 | 10.50 | 0.0 | 253.0 | 142.0 | 4.60 |
| 59 | 1.020 | 4.8 | 40.0 | 6.60 | 0.0 | 121.0 | 96.0 | 6.40 |
| 60 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 141.0 | 66.0 | 3.20 |
| 61 | 1.010 | 4.8 | 40.0 | 12.65 | 3.0 | 182.0 | 391.0 | 32.00 |
| 62 | 1.020 | 3.8 | 33.0 | 11.00 | 0.0 | 86.0 | 15.0 | 0.60 |
| 63 | 1.015 | 4.8 | 27.0 | 7.50 | 0.0 | 150.0 | 111.0 | 6.10 |
| 64 | 1.010 | 4.8 | 40.0 | 9.80 | 0.0 | 146.0 | 42.0 | 1.30 |
| 65 | 1.010 | 4.8 | 48.0 | 15.00 | 0.0 | 121.0 | 20.0 | 1.10 |
| 66 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 150.0 | 55.0 | 1.60 |
| 67 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 425.0 | 42.0 | 1.30 |
| 68 | 1.010 | 4.8 | 37.0 | 10.90 | 0.0 | 112.0 | 73.0 | 3.30 |
| 69 | 1.015 | 6.0 | 52.0 | 15.60 | 4.0 | 250.0 | 20.0 | 1.10 |
| 70 | 1.015 | 5.2 | 44.0 | 15.20 | 4.0 | 360.0 | 19.0 | 0.70 |
| 71 | 1.010 | 3.2 | 28.0 | 9.80 | 0.0 | 163.0 | 92.0 | 3.30 |
| 72 | 1.010 | 4.8 | 40.0 | 10.30 | 3.0 | 121.0 | 35.0 | 1.30 |
| 73 | 1.015 | 4.8 | 14.0 | 4.80 | 0.0 | 129.0 | 107.0 | 6.70 |
| 74 | 1.015 | 3.4 | 29.0 | 9.10 | 0.0 | 129.0 | 107.0 | 6.70 |
| 75 | 1.015 | 4.8 | 40.0 | 8.10 | 0.0 | 121.0 | 16.0 | 0.70 |
| 76 | 1.005 | 4.0 | 36.0 | 10.30 | 0.0 | 133.0 | 139.0 | 8.50 |
| 77 | 1.010 | 3.7 | 34.0 | 11.90 | 0.0 | 102.0 | 48.0 | 3.20 |
| 78 | 1.020 | 4.8 | 30.0 | 10.10 | 0.0 | 158.0 | 85.0 | 3.20 |
| 79 | 1.010 | 5.0 | 40.0 | 13.50 | 0.0 | 165.0 | 55.0 | 1.80 |
| 80 | 1.010 | 3.8 | 31.0 | 10.80 | 0.0 | 132.0 | 98.0 | 2.80 |
| 81 | 1.020 | 3.7 | 29.0 | 8.30 | 0.0 | 360.0 | 45.0 | 2.40 |
| 82 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 104.0 | 77.0 | 1.90 |
| 83 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 127.0 | 19.0 | 1.00 |
| 84 | 1.010 | 2.1 | 22.0 | 7.10 | 0.0 | 76.0 | 186.0 | 15.00 |
| 85 | 1.015 | 4.8 | 40.0 | 9.90 | 0.0 | 121.0 | 46.0 | 1.50 |
| 86 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 415.0 | 37.0 | 1.90 |
| 87 | 1.005 | 5.0 | 32.0 | 11.10 | 0.0 | 169.0 | 47.0 | 2.90 |
| 88 | 1.010 | 4.7 | 40.0 | 12.65 | 0.0 | 251.0 | 52.0 | 2.20 |
| 89 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 109.0 | 32.0 | 1.40 |
| 90 | 1.010 | 4.2 | 40.0 | 13.00 | 2.0 | 280.0 | 35.0 | 3.20 |
| 91 | 1.015 | 5.6 | 52.0 | 16.10 | 1.0 | 210.0 | 26.0 | 1.70 |
| 92 | 1.010 | 3.6 | 33.0 | 10.40 | 0.0 | 219.0 | 82.0 | 3.60 |
| 93 | 1.010 | 3.2 | 30.0 | 9.20 | 2.0 | 295.0 | 90.0 | 5.60 |
| 94 | 1.010 | 3.9 | 36.0 | 11.60 | 0.0 | 93.0 | 66.0 | 1.60 |
| 95 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 94.0 | 25.0 | 1.10 |
| 96 | 1.010 | 4.8 | 36.0 | 11.20 | 1.0 | 172.0 | 32.0 | 2.70 |
| 97 | 1.015 | 4.0 | 32.0 | 10.00 | 0.0 | 91.0 | 51.0 | 2.20 |
| 98 | 1.020 | 2.3 | 18.0 | 6.20 | 0.0 | 101.0 | 106.0 | 6.50 |
| 99 | 1.020 | 4.2 | 32.0 | 11.20 | 4.0 | 298.0 | 24.0 | 1.20 |
| 100 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 153.0 | 22.0 | 0.90 |
| 101 | 1.015 | 3.9 | 33.0 | 11.30 | 0.0 | 88.0 | 80.0 | 4.40 |
| 102 | 1.010 | 4.8 | 52.0 | 13.90 | 0.0 | 92.0 | 32.0 | 2.10 |
| 103 | 1.015 | 4.2 | 36.0 | 10.20 | 0.0 | 226.0 | 217.0 | 10.20 |
| 104 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 143.0 | 88.0 | 2.00 |
| 105 | 1.015 | 5.2 | 42.0 | 14.10 | 0.0 | 115.0 | 32.0 | 11.50 |
| 106 | 1.020 | 4.8 | 17.0 | 6.00 | 0.0 | 89.0 | 118.0 | 6.10 |
| 107 | 1.015 | 4.4 | 34.0 | 11.20 | 4.0 | 297.0 | 53.0 | 2.80 |
| 108 | 1.015 | 4.2 | 37.0 | 11.80 | 0.0 | 107.0 | 15.0 | 1.00 |
| 109 | 1.020 | 4.8 | 40.0 | 11.70 | 0.0 | 233.0 | 50.1 | 1.90 |
| 110 | 1.015 | 4.7 | 34.0 | 11.70 | 0.0 | 123.0 | 19.0 | 2.00 |
| 111 | 1.010 | 3.9 | 32.0 | 10.00 | 3.0 | 294.0 | 71.0 | 4.40 |
| 112 | 1.015 | 4.8 | 33.0 | 10.80 | 0.0 | 121.0 | 34.0 | 1.20 |
| 113 | 1.015 | 4.8 | 40.0 | 12.65 | 2.0 | 121.0 | 42.0 | 1.30 |
| 114 | 1.015 | 4.8 | 40.0 | 12.10 | 0.0 | 121.0 | 51.0 | 1.80 |
| 115 | 1.010 | 4.3 | 44.0 | 12.40 | 0.0 | 121.0 | 28.0 | 0.90 |
| 116 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 104.0 | 16.0 | 0.50 |
| 117 | 1.020 | 4.4 | 37.0 | 12.50 | 0.0 | 219.0 | 36.0 | 1.30 |
| 118 | 1.010 | 4.8 | 40.0 | 11.40 | 0.0 | 99.0 | 25.0 | 1.20 |
| 119 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 140.0 | 27.0 | 1.20 |
| 120 | 1.025 | 4.8 | 40.0 | 12.60 | 3.0 | 323.0 | 40.0 | 2.20 |
| 121 | 1.020 | 4.8 | 46.0 | 15.00 | 0.0 | 125.0 | 21.0 | 1.30 |
| 122 | 1.020 | 4.8 | 40.0 | 6.00 | 0.0 | 121.0 | 219.0 | 12.20 |
| 123 | 1.015 | 4.8 | 42.0 | 14.00 | 3.0 | 121.0 | 30.0 | 1.10 |
| 124 | 1.015 | 3.6 | 28.0 | 9.10 | 0.0 | 90.0 | 98.0 | 2.50 |
| 125 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 308.0 | 36.0 | 2.50 |
| 126 | 1.015 | 4.5 | 37.0 | 12.00 | 0.0 | 144.0 | 125.0 | 4.00 |
| 127 | 1.015 | 4.3 | 35.0 | 11.40 | 0.0 | 118.0 | 125.0 | 5.30 |
| 128 | 1.015 | 2.9 | 23.0 | 8.10 | 3.0 | 224.0 | 166.0 | 5.60 |
| 129 | 1.025 | 4.8 | 40.0 | 11.10 | 0.0 | 158.0 | 49.0 | 1.40 |
| 130 | 1.010 | 2.7 | 22.0 | 8.20 | 0.0 | 128.0 | 208.0 | 9.20 |
| 131 | 1.010 | 4.8 | 36.0 | 11.80 | 0.0 | 121.0 | 25.0 | 0.60 |
| 132 | 1.020 | 2.7 | 24.0 | 8.60 | 0.0 | 219.0 | 176.0 | 13.80 |
| 133 | 1.015 | 8.0 | 37.0 | 12.00 | 0.0 | 118.0 | 125.0 | 5.30 |
| 134 | 1.010 | 3.8 | 33.0 | 10.80 | 0.0 | 122.0 | 42.0 | 16.90 |
| 135 | 1.015 | 4.8 | 39.0 | 13.20 | 2.0 | 214.0 | 24.0 | 1.30 |
| 136 | 1.020 | 4.8 | 40.0 | 9.30 | 0.0 | 213.0 | 68.0 | 2.80 |
| 137 | 1.010 | 4.8 | 29.0 | 10.00 | 0.0 | 268.0 | 86.0 | 4.00 |
| 138 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 95.0 | 51.0 | 1.60 |
| 139 | 1.015 | 4.8 | 33.0 | 11.10 | 0.0 | 121.0 | 68.0 | 2.80 |
| 140 | 1.010 | 4.8 | 40.0 | 12.65 | 4.0 | 256.0 | 40.0 | 1.20 |
| 141 | 1.010 | 4.8 | 19.0 | 6.10 | 0.0 | 121.0 | 106.0 | 6.00 |
| 142 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 84.0 | 145.0 | 7.10 |
| 143 | 1.015 | 4.8 | 40.0 | 12.65 | 4.0 | 210.0 | 165.0 | 18.00 |
| 144 | 1.010 | 4.1 | 33.0 | 11.10 | 0.0 | 105.0 | 53.0 | 2.30 |
| 145 | 1.015 | 3.3 | 24.0 | 8.00 | 0.0 | 121.0 | 322.0 | 13.00 |
| 146 | 1.010 | 4.8 | 40.0 | 12.65 | 3.0 | 213.0 | 23.0 | 1.00 |
| 147 | 1.010 | 3.0 | 25.0 | 7.90 | 1.0 | 288.0 | 36.0 | 1.70 |
| 148 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 171.0 | 26.0 | 48.10 |
| 149 | 1.020 | 4.8 | 32.0 | 10.50 | 0.0 | 139.0 | 29.0 | 1.00 |
| 150 | 1.025 | 4.8 | 41.0 | 12.30 | 0.0 | 78.0 | 27.0 | 0.90 |
| 151 | 1.020 | 4.8 | 30.0 | 9.60 | 0.0 | 172.0 | 46.0 | 1.70 |
| 152 | 1.010 | 4.8 | 32.0 | 10.90 | 0.0 | 121.0 | 20.0 | 0.80 |
| 153 | 1.010 | 2.9 | 22.0 | 8.30 | 1.0 | 273.0 | 235.0 | 14.20 |
| 154 | 1.005 | 3.0 | 26.0 | 8.40 | 3.0 | 242.0 | 132.0 | 16.40 |
| 155 | 1.020 | 4.8 | 36.0 | 11.10 | 0.0 | 123.0 | 40.0 | 1.80 |
| 156 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 153.0 | 76.0 | 3.30 |
| 157 | 1.025 | 3.9 | 39.0 | 12.60 | 0.0 | 122.0 | 42.0 | 1.70 |
| 158 | 1.020 | 4.8 | 31.0 | 10.90 | 2.0 | 424.0 | 48.0 | 1.50 |
| 159 | 1.010 | 4.3 | 35.0 | 10.40 | 0.0 | 303.0 | 35.0 | 1.30 |
| 160 | 1.020 | 2.4 | 35.0 | 10.90 | 0.0 | 148.0 | 39.0 | 2.10 |
| 161 | 1.015 | 4.8 | 42.0 | 14.30 | 0.0 | 121.0 | 42.0 | 1.30 |
| 162 | 1.020 | 4.8 | 37.0 | 9.80 | 0.0 | 204.0 | 34.0 | 1.50 |
| 163 | 1.010 | 3.2 | 27.0 | 9.00 | 0.0 | 160.0 | 40.0 | 2.00 |
| 164 | 1.015 | 5.4 | 40.0 | 14.30 | 0.0 | 192.0 | 15.0 | 0.80 |
| 165 | 1.020 | 4.8 | 40.0 | 12.65 | 2.0 | 121.0 | 42.0 | 1.30 |
| 166 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 76.0 | 44.0 | 3.90 |
| 167 | 1.020 | 4.8 | 42.0 | 12.70 | 0.0 | 139.0 | 19.0 | 0.90 |
| 168 | 1.015 | 4.8 | 39.0 | 11.00 | 4.0 | 307.0 | 28.0 | 1.50 |
| 169 | 1.010 | 4.8 | 27.0 | 8.70 | 2.0 | 220.0 | 68.0 | 2.80 |
| 170 | 1.015 | 4.4 | 33.0 | 12.50 | 5.0 | 447.0 | 41.0 | 1.70 |
| 171 | 1.020 | 3.1 | 26.0 | 8.70 | 0.0 | 102.0 | 60.0 | 2.60 |
| 172 | 1.010 | 4.9 | 34.0 | 10.60 | 2.0 | 309.0 | 113.0 | 2.90 |
| 173 | 1.015 | 4.8 | 41.0 | 13.10 | 0.0 | 22.0 | 1.5 | 7.30 |
| 174 | 1.020 | 4.6 | 35.0 | 11.00 | 0.0 | 111.0 | 146.0 | 7.50 |
| 175 | 1.010 | 3.4 | 40.0 | 12.65 | 0.0 | 261.0 | 58.0 | 2.20 |
| 176 | 1.010 | 3.9 | 23.0 | 8.30 | 0.0 | 107.0 | 40.0 | 1.70 |
| 177 | 1.015 | 4.8 | 41.0 | 13.20 | 1.0 | 215.0 | 133.0 | 2.50 |
| 178 | 1.020 | 4.8 | 34.0 | 9.80 | 0.0 | 93.0 | 153.0 | 2.70 |
| 179 | 1.010 | 4.8 | 39.0 | 11.90 | 0.0 | 124.0 | 53.0 | 2.30 |
| 180 | 1.010 | 4.8 | 28.0 | 10.30 | 4.0 | 234.0 | 56.0 | 1.90 |
| 181 | 1.025 | 3.7 | 30.0 | 10.00 | 0.0 | 117.0 | 52.0 | 2.20 |
| 182 | 1.020 | 4.8 | 35.0 | 11.30 | 0.0 | 131.0 | 23.0 | 0.80 |
| 183 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 101.0 | 106.0 | 6.50 |
| 184 | 1.015 | 3.6 | 31.0 | 11.30 | 2.0 | 352.0 | 137.0 | 3.30 |
| 185 | 1.020 | 4.8 | 34.0 | 12.00 | 0.0 | 99.0 | 23.0 | 0.60 |
| 186 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 46.0 | 1.00 |
| 187 | 1.010 | 4.8 | 34.0 | 10.70 | 0.0 | 121.0 | 22.0 | 0.70 |
| 188 | 1.020 | 4.8 | 38.0 | 12.20 | 0.0 | 80.0 | 66.0 | 2.50 |
| 189 | 1.010 | 3.4 | 29.0 | 9.50 | 1.0 | 239.0 | 58.0 | 4.30 |
| 190 | 1.010 | 4.8 | 30.0 | 9.90 | 0.0 | 94.0 | 67.0 | 1.00 |
| 191 | 1.010 | 3.4 | 26.0 | 9.10 | 0.0 | 110.0 | 115.0 | 6.00 |
| 192 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 130.0 | 16.0 | 0.90 |
| 193 | 1.025 | 2.8 | 15.0 | 5.50 | 0.0 | 121.0 | 223.0 | 18.10 |
| 194 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 49.0 | 1.20 |
| 195 | 1.020 | 4.8 | 40.0 | 5.80 | 1.0 | 184.0 | 98.6 | 3.30 |
| 196 | 1.010 | 3.5 | 24.0 | 8.10 | 0.0 | 129.0 | 158.0 | 11.80 |
| 197 | 1.020 | 3.0 | 40.0 | 6.80 | 0.0 | 121.0 | 111.0 | 9.30 |
| 198 | 1.020 | 3.9 | 30.0 | 11.20 | 2.0 | 252.0 | 40.0 | 3.20 |
| 199 | 1.015 | 3.2 | 25.0 | 8.80 | 0.0 | 92.0 | 37.0 | 1.50 |
| 200 | 1.025 | 3.9 | 37.0 | 12.00 | 0.0 | 139.0 | 89.0 | 3.00 |
| 201 | 1.020 | 4.8 | 21.0 | 7.90 | 0.0 | 113.0 | 94.0 | 7.30 |
| 202 | 1.020 | 4.8 | 24.0 | 8.00 | 0.0 | 114.0 | 74.0 | 2.90 |
| 203 | 1.020 | 4.8 | 40.0 | 8.50 | 0.0 | 207.0 | 80.0 | 6.80 |
| 204 | 1.010 | 4.8 | 31.0 | 8.80 | 2.0 | 172.0 | 82.0 | 13.50 |
| 205 | 1.020 | 4.8 | 43.0 | 12.60 | 0.0 | 100.0 | 28.0 | 2.10 |
| 206 | 1.010 | 4.8 | 41.0 | 13.80 | 0.0 | 109.0 | 96.0 | 3.90 |
| 207 | 1.010 | 4.6 | 41.0 | 12.00 | 0.0 | 230.0 | 50.0 | 2.20 |
| 208 | 1.020 | 4.9 | 41.0 | 12.30 | 0.0 | 341.0 | 37.0 | 1.50 |
| 209 | 1.020 | 4.8 | 40.0 | 11.50 | 0.0 | 121.0 | 42.0 | 1.30 |
| 210 | 1.015 | 3.9 | 20.0 | 7.30 | 2.0 | 255.0 | 132.0 | 12.80 |
| 211 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 103.0 | 18.0 | 1.20 |
| 212 | 1.015 | 3.4 | 31.0 | 10.90 | 4.0 | 253.0 | 150.0 | 11.90 |
| 213 | 1.010 | 3.7 | 34.0 | 10.90 | 1.0 | 214.0 | 73.0 | 3.90 |
| 214 | 1.015 | 5.2 | 43.0 | 13.70 | 0.0 | 171.0 | 30.0 | 1.00 |
| 215 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 42.0 | 1.30 |
| 216 | 1.010 | 4.8 | 38.0 | 12.80 | 0.0 | 107.0 | 15.0 | 1.30 |
| 217 | 1.010 | 4.3 | 36.0 | 12.20 | 0.0 | 78.0 | 61.0 | 1.80 |
| 218 | 1.015 | 4.8 | 34.0 | 11.80 | 0.0 | 92.0 | 19.0 | 0.80 |
| 219 | 1.010 | 3.3 | 28.0 | 9.80 | 0.0 | 238.0 | 57.0 | 2.50 |
| 220 | 1.010 | 4.8 | 36.0 | 11.90 | 0.0 | 103.0 | 42.0 | 1.30 |
| 221 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 248.0 | 30.0 | 1.70 |
| 222 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 108.0 | 68.0 | 1.80 |
| 223 | 1.010 | 4.6 | 38.0 | 13.00 | 3.0 | 303.0 | 30.0 | 1.30 |
| 224 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 117.0 | 28.0 | 2.20 |
| 225 | 1.010 | 4.5 | 35.0 | 11.50 | 5.0 | 490.0 | 95.0 | 2.70 |
| 226 | 1.015 | 3.4 | 26.0 | 7.90 | 2.0 | 163.0 | 54.0 | 7.20 |
| 227 | 1.015 | 3.8 | 36.0 | 11.30 | 0.0 | 120.0 | 48.0 | 1.60 |
| 228 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 124.0 | 52.0 | 2.50 |
| 229 | 1.010 | 3.8 | 31.0 | 9.60 | 0.0 | 241.0 | 191.0 | 12.00 |
| 230 | 1.010 | 4.8 | 40.0 | 12.65 | 0.0 | 192.0 | 17.0 | 1.70 |
| 231 | 1.020 | 4.8 | 35.0 | 11.50 | 0.0 | 269.0 | 51.0 | 2.80 |
| 232 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 121.0 | 42.0 | 1.30 |
| 233 | 1.015 | 4.8 | 40.0 | 12.65 | 0.0 | 93.0 | 20.0 | 1.60 |
| 234 | 1.010 | 5.2 | 44.0 | 15.00 | 0.0 | 121.0 | 19.0 | 1.30 |
| 235 | 1.010 | 4.8 | 26.0 | 7.90 | 0.0 | 113.0 | 93.0 | 2.30 |
| 236 | 1.020 | 4.8 | 25.0 | 9.10 | 0.0 | 74.0 | 66.0 | 2.00 |
| 237 | 1.015 | 4.8 | 40.0 | 12.70 | 2.0 | 141.0 | 53.0 | 2.20 |
| 238 | 1.020 | 4.8 | 28.0 | 9.40 | 0.0 | 201.0 | 241.0 | 13.40 |
| 239 | 1.015 | 4.8 | 39.0 | 11.90 | 0.0 | 104.0 | 50.0 | 1.60 |
| 240 | 1.015 | 4.1 | 36.0 | 11.40 | 0.0 | 203.0 | 46.0 | 1.40 |
| 241 | 1.015 | 3.9 | 31.0 | 10.40 | 0.0 | 165.0 | 45.0 | 1.50 |
| 242 | 1.010 | 3.3 | 28.0 | 9.40 | 3.0 | 214.0 | 96.0 | 6.30 |
| 243 | 1.020 | 6.1 | 47.0 | 13.40 | 1.0 | 169.0 | 48.0 | 2.40 |
| 244 | 1.015 | 4.6 | 40.0 | 12.20 | 2.0 | 463.0 | 64.0 | 2.80 |
| 245 | 1.020 | 2.6 | 19.0 | 6.30 | 0.0 | 103.0 | 79.0 | 5.30 |
| 246 | 1.015 | 2.5 | 26.0 | 8.60 | 0.0 | 106.0 | 215.0 | 15.20 |
| 247 | 1.025 | 4.8 | 40.0 | 12.65 | 0.0 | 150.0 | 18.0 | 1.20 |
| 248 | 1.010 | 4.1 | 37.0 | 12.60 | 3.0 | 424.0 | 55.0 | 1.70 |
| 249 | 1.010 | 2.1 | 9.0 | 3.10 | 1.0 | 176.0 | 309.0 | 13.30 |
| 250 | 1.025 | 4.5 | 48.0 | 15.00 | 0.0 | 140.0 | 10.0 | 1.20 |
| 251 | 1.025 | 5.0 | 52.0 | 17.00 | 0.0 | 70.0 | 36.0 | 1.00 |
| 252 | 1.025 | 4.7 | 46.0 | 15.90 | 0.0 | 82.0 | 49.0 | 0.60 |
| 253 | 1.025 | 6.2 | 42.0 | 15.40 | 0.0 | 119.0 | 17.0 | 1.20 |
| 254 | 1.025 | 5.2 | 49.0 | 13.00 | 0.0 | 99.0 | 38.0 | 0.80 |
| 255 | 1.025 | 6.3 | 52.0 | 13.60 | 0.0 | 121.0 | 27.0 | 1.20 |
| 256 | 1.025 | 5.1 | 41.0 | 14.50 | 0.0 | 131.0 | 10.0 | 0.50 |
| 257 | 1.020 | 5.8 | 46.0 | 14.00 | 0.0 | 91.0 | 36.0 | 0.70 |
| 258 | 1.020 | 5.5 | 44.0 | 13.90 | 0.0 | 98.0 | 20.0 | 0.50 |
| 259 | 1.020 | 5.2 | 45.0 | 16.10 | 0.0 | 104.0 | 31.0 | 1.20 |
| 260 | 1.020 | 5.3 | 45.0 | 14.10 | 0.0 | 131.0 | 38.0 | 1.00 |
| 261 | 1.020 | 4.9 | 41.0 | 17.00 | 0.0 | 122.0 | 32.0 | 1.20 |
| 262 | 1.020 | 5.4 | 43.0 | 15.50 | 0.0 | 118.0 | 18.0 | 0.90 |
| 263 | 1.020 | 5.2 | 45.0 | 16.20 | 0.0 | 117.0 | 46.0 | 1.20 |
| 264 | 1.020 | 4.5 | 50.0 | 14.40 | 0.0 | 132.0 | 24.0 | 0.70 |
| 265 | 1.020 | 5.0 | 48.0 | 14.20 | 0.0 | 97.0 | 40.0 | 0.60 |
| 266 | 1.020 | 5.3 | 41.0 | 13.20 | 0.0 | 133.0 | 17.0 | 1.20 |
| 267 | 1.025 | 4.8 | 48.0 | 13.90 | 0.0 | 122.0 | 33.0 | 0.90 |
| 268 | 1.020 | 4.9 | 53.0 | 16.30 | 0.0 | 100.0 | 49.0 | 1.00 |
| 269 | 1.025 | 5.3 | 48.0 | 15.00 | 0.0 | 121.0 | 19.0 | 1.20 |
| 270 | 1.025 | 5.0 | 41.0 | 14.30 | 0.0 | 111.0 | 34.0 | 1.10 |
| 271 | 1.025 | 4.5 | 42.0 | 13.80 | 0.0 | 96.0 | 25.0 | 0.50 |
| 272 | 1.025 | 5.5 | 42.0 | 14.80 | 0.0 | 139.0 | 15.0 | 1.20 |
| 273 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 95.0 | 35.0 | 0.90 |
| 274 | 1.020 | 4.8 | 44.0 | 14.40 | 0.0 | 107.0 | 23.0 | 0.70 |
| 275 | 1.020 | 4.6 | 43.0 | 16.50 | 0.0 | 125.0 | 22.0 | 1.20 |
| 276 | 1.025 | 5.5 | 41.0 | 14.00 | 0.0 | 121.0 | 42.0 | 1.30 |
| 277 | 1.025 | 4.8 | 50.0 | 15.70 | 0.0 | 123.0 | 46.0 | 1.00 |
| 278 | 1.020 | 6.4 | 44.0 | 14.50 | 0.0 | 112.0 | 44.0 | 1.20 |
| 279 | 1.025 | 5.6 | 48.0 | 16.30 | 0.0 | 140.0 | 23.0 | 0.60 |
| 280 | 1.020 | 5.2 | 52.0 | 13.30 | 0.0 | 93.0 | 33.0 | 0.90 |
| 281 | 1.025 | 6.0 | 41.0 | 15.50 | 0.0 | 130.0 | 50.0 | 1.20 |
| 282 | 1.020 | 4.8 | 44.0 | 14.60 | 0.0 | 123.0 | 44.0 | 1.00 |
| 283 | 1.020 | 5.7 | 43.0 | 16.40 | 0.0 | 121.0 | 42.0 | 1.30 |
| 284 | 1.025 | 6.0 | 52.0 | 16.90 | 0.0 | 100.0 | 37.0 | 1.20 |
| 285 | 1.020 | 5.9 | 41.0 | 16.00 | 0.0 | 94.0 | 19.0 | 0.70 |
| 286 | 1.020 | 6.0 | 44.0 | 14.70 | 0.0 | 81.0 | 18.0 | 0.80 |
| 287 | 1.025 | 4.8 | 43.0 | 13.40 | 0.0 | 124.0 | 22.0 | 0.60 |
| 288 | 1.025 | 5.1 | 50.0 | 15.90 | 0.0 | 70.0 | 46.0 | 1.20 |
| 289 | 1.020 | 5.3 | 43.0 | 16.60 | 0.0 | 93.0 | 32.0 | 0.90 |
| 290 | 1.020 | 5.9 | 52.0 | 14.80 | 0.0 | 76.0 | 28.0 | 0.60 |
| 291 | 1.025 | 5.7 | 41.0 | 14.90 | 0.0 | 124.0 | 44.0 | 1.00 |
| 292 | 1.020 | 5.0 | 52.0 | 16.70 | 0.0 | 89.0 | 42.0 | 0.50 |
| 293 | 1.020 | 5.4 | 48.0 | 14.90 | 0.0 | 92.0 | 19.0 | 1.20 |
| 294 | 1.020 | 5.8 | 40.0 | 14.30 | 0.0 | 110.0 | 50.0 | 0.70 |
| 295 | 1.020 | 6.5 | 50.0 | 15.00 | 0.0 | 106.0 | 25.0 | 0.90 |
| 296 | 1.020 | 5.9 | 41.0 | 16.80 | 0.0 | 125.0 | 38.0 | 0.60 |
| 297 | 1.025 | 5.2 | 45.0 | 15.80 | 0.0 | 116.0 | 26.0 | 1.00 |
| 298 | 1.020 | 4.9 | 48.0 | 13.50 | 0.0 | 91.0 | 49.0 | 1.20 |
| 299 | 1.020 | 4.7 | 52.0 | 15.10 | 0.0 | 127.0 | 48.0 | 0.50 |
| 300 | 1.020 | 5.8 | 43.0 | 15.00 | 0.0 | 114.0 | 26.0 | 0.70 |
| 301 | 1.025 | 5.0 | 41.0 | 16.90 | 0.0 | 96.0 | 33.0 | 0.90 |
| 302 | 1.020 | 4.8 | 48.0 | 14.80 | 0.0 | 127.0 | 44.0 | 1.20 |
| 303 | 1.020 | 6.1 | 50.0 | 17.00 | 0.0 | 107.0 | 26.0 | 1.10 |
| 304 | 1.025 | 4.5 | 45.0 | 13.10 | 0.0 | 128.0 | 38.0 | 0.60 |
| 305 | 1.020 | 5.2 | 41.0 | 17.10 | 0.0 | 122.0 | 25.0 | 0.80 |
| 306 | 1.020 | 5.7 | 52.0 | 15.20 | 0.0 | 128.0 | 30.0 | 1.20 |
| 307 | 1.020 | 4.5 | 44.0 | 13.60 | 0.0 | 137.0 | 17.0 | 0.50 |
| 308 | 1.025 | 4.9 | 48.0 | 13.90 | 0.0 | 81.0 | 46.0 | 0.60 |
| 309 | 1.020 | 5.9 | 40.0 | 17.20 | 0.0 | 129.0 | 25.0 | 1.20 |
| 310 | 1.020 | 5.4 | 44.0 | 13.20 | 0.0 | 102.0 | 27.0 | 0.70 |
| 311 | 1.025 | 5.6 | 45.0 | 13.70 | 0.0 | 132.0 | 18.0 | 1.10 |
| 312 | 1.020 | 6.1 | 48.0 | 15.30 | 0.0 | 121.0 | 42.0 | 1.30 |
| 313 | 1.020 | 4.8 | 52.0 | 17.30 | 0.0 | 104.0 | 28.0 | 0.90 |
| 314 | 1.025 | 4.7 | 41.0 | 15.60 | 0.0 | 131.0 | 46.0 | 0.60 |
| 315 | 1.025 | 4.4 | 48.0 | 13.80 | 0.0 | 121.0 | 42.0 | 1.30 |
| 316 | 1.020 | 5.2 | 48.0 | 15.40 | 0.0 | 99.0 | 30.0 | 0.50 |
| 317 | 1.020 | 4.9 | 40.0 | 15.00 | 0.0 | 102.0 | 48.0 | 1.20 |
| 318 | 1.025 | 5.3 | 52.0 | 17.40 | 0.0 | 120.0 | 29.0 | 0.70 |
| 319 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 138.0 | 15.0 | 1.10 |
| 320 | 1.020 | 6.2 | 44.0 | 15.70 | 0.0 | 105.0 | 49.0 | 1.20 |
| 321 | 1.020 | 4.8 | 48.0 | 13.90 | 0.0 | 109.0 | 39.0 | 1.00 |
| 322 | 1.020 | 4.9 | 43.0 | 16.00 | 0.0 | 120.0 | 40.0 | 0.50 |
| 323 | 1.025 | 4.5 | 45.0 | 15.90 | 0.0 | 130.0 | 30.0 | 1.10 |
| 324 | 1.020 | 4.8 | 40.0 | 12.65 | 0.0 | 119.0 | 15.0 | 0.70 |
| 325 | 1.020 | 6.5 | 50.0 | 14.00 | 0.0 | 100.0 | 50.0 | 1.20 |
| 326 | 1.020 | 5.2 | 41.0 | 15.80 | 0.0 | 109.0 | 25.0 | 1.10 |
| 327 | 1.025 | 5.8 | 44.0 | 13.40 | 0.0 | 120.0 | 31.0 | 0.80 |
| 328 | 1.020 | 6.5 | 45.0 | 12.65 | 0.0 | 131.0 | 29.0 | 0.60 |
| 329 | 1.025 | 5.1 | 48.0 | 14.10 | 0.0 | 80.0 | 25.0 | 0.90 |
| 330 | 1.020 | 4.8 | 42.0 | 12.65 | 0.0 | 114.0 | 32.0 | 1.10 |
| 331 | 1.025 | 4.5 | 46.0 | 13.50 | 0.0 | 130.0 | 39.0 | 0.70 |
| 332 | 1.025 | 6.1 | 44.0 | 15.30 | 0.0 | 121.0 | 33.0 | 1.00 |
| 333 | 1.020 | 5.5 | 46.0 | 17.70 | 0.0 | 99.0 | 46.0 | 1.20 |
| 334 | 1.025 | 4.5 | 43.0 | 15.40 | 0.0 | 125.0 | 42.0 | 1.30 |
| 335 | 1.020 | 5.6 | 48.0 | 14.20 | 0.0 | 134.0 | 45.0 | 0.50 |
| 336 | 1.020 | 5.2 | 40.0 | 15.20 | 0.0 | 119.0 | 27.0 | 0.50 |
| 337 | 1.025 | 6.2 | 52.0 | 14.00 | 0.0 | 92.0 | 40.0 | 0.90 |
| 338 | 1.020 | 4.5 | 44.0 | 17.80 | 0.0 | 132.0 | 34.0 | 0.80 |
| 339 | 1.020 | 4.9 | 48.0 | 13.30 | 0.0 | 88.0 | 42.0 | 0.50 |
| 340 | 1.025 | 5.9 | 43.0 | 14.30 | 0.0 | 100.0 | 29.0 | 1.10 |
| 341 | 1.025 | 4.7 | 41.0 | 13.40 | 0.0 | 130.0 | 37.0 | 0.90 |
| 342 | 1.020 | 6.3 | 50.0 | 15.00 | 0.0 | 95.0 | 46.0 | 0.50 |
| 343 | 1.025 | 5.7 | 50.0 | 16.20 | 0.0 | 111.0 | 35.0 | 0.80 |
| 344 | 1.020 | 4.7 | 42.0 | 14.40 | 0.0 | 106.0 | 27.0 | 0.70 |
| 345 | 1.025 | 6.4 | 42.0 | 13.50 | 0.0 | 97.0 | 18.0 | 1.20 |
| 346 | 1.020 | 5.8 | 52.0 | 15.50 | 0.0 | 130.0 | 41.0 | 0.90 |
| 347 | 1.025 | 5.5 | 43.0 | 17.80 | 0.0 | 108.0 | 25.0 | 1.00 |
| 348 | 1.020 | 6.4 | 44.0 | 13.60 | 0.0 | 99.0 | 19.0 | 0.50 |
| 349 | 1.025 | 6.1 | 52.0 | 14.50 | 0.0 | 82.0 | 36.0 | 1.10 |
| 350 | 1.025 | 4.5 | 43.0 | 16.10 | 0.0 | 85.0 | 20.0 | 1.00 |
| 351 | 1.020 | 4.7 | 40.0 | 17.50 | 0.0 | 83.0 | 49.0 | 0.90 |
| 352 | 1.020 | 5.2 | 48.0 | 15.00 | 0.0 | 109.0 | 47.0 | 1.10 |
| 353 | 1.020 | 4.5 | 51.0 | 13.60 | 0.0 | 86.0 | 37.0 | 0.60 |
| 354 | 1.025 | 5.1 | 41.0 | 14.60 | 0.0 | 102.0 | 17.0 | 0.40 |
| 355 | 1.020 | 4.6 | 52.0 | 15.00 | 0.0 | 95.0 | 24.0 | 0.80 |
| 356 | 1.025 | 6.1 | 47.0 | 17.10 | 0.0 | 87.0 | 38.0 | 0.50 |
| 357 | 1.025 | 4.9 | 42.0 | 13.60 | 0.0 | 107.0 | 16.0 | 1.10 |
| 358 | 1.020 | 5.6 | 45.0 | 13.00 | 0.0 | 117.0 | 22.0 | 1.20 |
| 359 | 1.020 | 4.5 | 53.0 | 17.20 | 0.0 | 88.0 | 50.0 | 0.60 |
| 360 | 1.025 | 6.2 | 43.0 | 14.70 | 0.0 | 105.0 | 39.0 | 0.50 |
| 361 | 1.020 | 5.8 | 54.0 | 13.70 | 0.0 | 70.0 | 16.0 | 0.70 |
| 362 | 1.025 | 4.8 | 40.0 | 15.00 | 0.0 | 89.0 | 19.0 | 1.10 |
| 363 | 1.025 | 5.2 | 44.0 | 17.80 | 0.0 | 99.0 | 40.0 | 0.50 |
| 364 | 1.025 | 4.7 | 45.0 | 14.80 | 0.0 | 118.0 | 44.0 | 0.70 |
| 365 | 1.020 | 6.3 | 40.0 | 12.65 | 0.0 | 93.0 | 46.0 | 1.00 |
| 366 | 1.025 | 5.3 | 46.0 | 15.00 | 0.0 | 81.0 | 15.0 | 0.50 |
| 367 | 1.025 | 6.1 | 50.0 | 17.40 | 0.0 | 125.0 | 41.0 | 1.10 |
| 368 | 1.025 | 5.9 | 45.0 | 14.90 | 0.0 | 82.0 | 42.0 | 0.70 |
| 369 | 1.020 | 4.8 | 46.0 | 13.60 | 0.0 | 107.0 | 48.0 | 0.80 |
| 370 | 1.020 | 5.4 | 50.0 | 16.20 | 0.0 | 83.0 | 42.0 | 1.20 |
| 371 | 1.025 | 5.0 | 51.0 | 17.60 | 0.0 | 79.0 | 50.0 | 0.50 |
| 372 | 1.020 | 5.5 | 52.0 | 15.00 | 0.0 | 109.0 | 26.0 | 0.90 |
| 373 | 1.025 | 4.9 | 47.0 | 13.70 | 0.0 | 133.0 | 38.0 | 1.00 |
| 374 | 1.025 | 6.4 | 40.0 | 16.30 | 0.0 | 111.0 | 44.0 | 1.20 |
| 375 | 1.020 | 5.6 | 48.0 | 15.10 | 0.0 | 74.0 | 41.0 | 0.50 |
| 376 | 1.025 | 5.2 | 53.0 | 16.40 | 0.0 | 88.0 | 16.0 | 1.10 |
| 377 | 1.020 | 4.8 | 49.0 | 13.80 | 0.0 | 97.0 | 27.0 | 0.70 |
| 378 | 1.025 | 5.5 | 42.0 | 15.20 | 0.0 | 121.0 | 42.0 | 0.90 |
| 379 | 1.025 | 5.7 | 50.0 | 16.10 | 0.0 | 78.0 | 45.0 | 0.60 |
| 380 | 1.020 | 4.9 | 54.0 | 15.30 | 0.0 | 113.0 | 23.0 | 1.10 |
| 381 | 1.025 | 5.9 | 40.0 | 16.60 | 0.0 | 79.0 | 47.0 | 0.50 |
| 382 | 1.025 | 6.5 | 51.0 | 16.80 | 0.0 | 75.0 | 22.0 | 0.80 |
| 383 | 1.025 | 5.0 | 49.0 | 13.90 | 0.0 | 119.0 | 46.0 | 0.70 |
| 384 | 1.020 | 4.5 | 42.0 | 15.40 | 0.0 | 132.0 | 18.0 | 1.10 |
| 385 | 1.020 | 5.1 | 52.0 | 16.50 | 0.0 | 113.0 | 25.0 | 0.60 |
| 386 | 1.025 | 6.5 | 43.0 | 16.40 | 0.0 | 100.0 | 47.0 | 0.50 |
| 387 | 1.025 | 5.2 | 50.0 | 16.70 | 0.0 | 93.0 | 17.0 | 0.90 |
| 388 | 1.020 | 6.4 | 46.0 | 15.50 | 0.0 | 94.0 | 15.0 | 1.20 |
| 389 | 1.025 | 5.8 | 52.0 | 17.00 | 0.0 | 112.0 | 48.0 | 0.70 |
| 390 | 1.025 | 5.3 | 52.0 | 15.00 | 0.0 | 99.0 | 25.0 | 0.80 |
| 391 | 1.025 | 6.3 | 44.0 | 15.60 | 0.0 | 85.0 | 16.0 | 1.10 |
| 392 | 1.020 | 5.5 | 46.0 | 14.80 | 0.0 | 133.0 | 48.0 | 1.20 |
| 393 | 1.025 | 5.4 | 54.0 | 13.00 | 0.0 | 117.0 | 45.0 | 0.70 |
| 394 | 1.020 | 4.6 | 45.0 | 14.10 | 0.0 | 137.0 | 46.0 | 0.80 |
| 395 | 1.020 | 4.9 | 47.0 | 15.70 | 0.0 | 140.0 | 49.0 | 0.50 |
| 396 | 1.025 | 6.2 | 54.0 | 16.50 | 0.0 | 75.0 | 31.0 | 1.20 |
| 397 | 1.020 | 5.4 | 49.0 | 15.80 | 0.0 | 100.0 | 26.0 | 0.60 |
| 398 | 1.025 | 5.9 | 51.0 | 14.20 | 0.0 | 114.0 | 50.0 | 1.00 |
| 399 | 1.025 | 6.1 | 53.0 | 15.80 | 0.0 | 131.0 | 18.0 | 1.10 |
for pos_feature in pos_features:
kde_plot(pos_feature)
import plotly.express as px
# Defining violin and scatter plot
def violin(col):
fig = px.violin(df, y=col, x="class", color="class", box=True)
return fig.show()
def scatters(col1,col2):
fig = px.scatter(df, x=col1, y=col2, color="class")
return fig.show()
for pos_feature in pos_features:
violin(pos_feature)
## look at scatters of some features for pos correlation
# fig,axes = plt.subplots(1,3,figsize=(10,5))
# axes[0].scatter(df['red_blood_cell_count'], df['packed_cell_volume'], c=df['class'])
# axes[0].set_xlabel('red_blood_cell_count')
# axes[0].set_ylabel('packed_cell_volume')
# axes[1].scatter(df['red_blood_cell_count'], df['haemoglobin'], c=df['class'])
# axes[1].set_xlabel('red_blood_cell_count')
# axes[1].set_ylabel('haemoglobin')
# axes[2].scatter(df['packed_cell_volume'], df['haemoglobin'], c=df['class'])
# axes[2].set_xlabel('packed_cell_volume')
# axes[2].set_ylabel('haemoglobin')
# fig.subplots_adjust(wspace=0.4)
# plt.show()
scatters('red_blood_cell_count', 'packed_cell_volume')
scatters('red_blood_cell_count', 'haemoglobin')
scatters('haemoglobin','packed_cell_volume')
pos_names = ['specific_gravity', 'red_blood_cell_count', 'packed_cell_volume', 'haemoglobin', 'sugar', 'blood_glucose_random',
'blood_urea', 'serum_creatinine']
df_pos_names = df[pos_names]
plt.figure(figsize=(10,10))
df_pos_names.plot(figsize=(20,10), kind='density', subplots=True, layout=(2,4), sharex=False)
plt.tight_layout()
<Figure size 720x720 with 0 Axes>
## Negative corr visualization
scatters('red_blood_cell_count','albumin')
scatters('packed_cell_volume','blood_urea')
fig = px.bar(df, x="specific_gravity", y="packed_cell_volume",
color='class', barmode='group',
height=400)
fig.show()
sns.pairplot(df, hue = 'class', palette = 'CMRmap')
<seaborn.axisgrid.PairGrid at 0x297e7f640>
# Number of rows and columns in the plot
n_cols = 3
n_rows = math.ceil(len(numerical_columns)/n_cols)
# Check the distribution of y variable corresponding to every x variable
fig,ax = plt.subplots(nrows = n_rows, ncols = n_cols, figsize=(30,30))
row = 0
col = 0
for i in numerical_columns:
if col > 2:
row += 1
col = 0
axes = ax[row,col]
sns.boxplot(x = df[input_target_class], y = df[i],ax = axes)
col += 1
plt.tight_layout()
plt.title("Individual Features by Class")
plt.show()
df.columns.values
array(['age', 'blood_pressure', 'specific_gravity', 'albumin', 'sugar',
'red_blood_cells', 'pus_cell', 'pus_cell_clumps', 'bacteria',
'blood_glucose_random', 'blood_urea', 'serum_creatinine', 'sodium',
'potassium', 'haemoglobin', 'packed_cell_volume',
'white_blood_cell_count', 'red_blood_cell_count', 'ypertension',
'diabetes_mellitus', 'coronary_artery_disease', 'appetite',
'pedal_edema', 'anemia', 'class'], dtype=object)
categorical_columns.remove(input_target_class)
categorical_columns
['appetite', 'anemia', 'bacteria', 'coronary_artery_disease', 'pus_cell', 'pedal_edema', 'ypertension', 'diabetes_mellitus', 'red_blood_cells', 'pus_cell_clumps']
# Get the list of numeric and categorical columns according to the input
if input_datatype_selection == "auto":
binary_columns = [col for col in df.columns if df[col].nunique() == 2]
print("Binary Columns : ", binary_columns)
categorical_columns = [col for col in df.columns if df[col].dtype == "object"]
print("Categorical Columns : ", categorical_columns)
categorical_columns = binary_columns + categorical_columns
categorical_columns = list(set(categorical_columns))
numerical_columns = [col for col in df.columns if col not in categorical_columns]
print("Numerical Columns : ", numerical_columns)
else:
categorical_columns = input_cat_columns
print("Categorical Columns : ", categorical_columns)
numerical_columns = input_num_columns
print("Numerical Columns : ", numerical_columns)
Binary Columns : ['red_blood_cells', 'pus_cell', 'pus_cell_clumps', 'bacteria', 'ypertension', 'diabetes_mellitus', 'coronary_artery_disease', 'appetite', 'pedal_edema', 'anemia', 'class'] Categorical Columns : [] Numerical Columns : ['age', 'blood_pressure', 'specific_gravity', 'albumin', 'sugar', 'blood_glucose_random', 'blood_urea', 'serum_creatinine', 'sodium', 'potassium', 'haemoglobin', 'packed_cell_volume', 'white_blood_cell_count', 'red_blood_cell_count']
# Select the encoding technique according to the input provided
if input_encoding == "LabelEncoder":
# Use LabelEncoder function from sklearn
le = LabelEncoder()
df[categorical_columns] = df[categorical_columns].apply(lambda col: le.fit_transform(col))
elif input_encoding == "OneHotEncoder":
# Use pandas get dummies function to one hot encode
df = pd.get_dummies(df, columns=categorical_columns)
elif input_encoding == "OrdinalEncoder":
# Use OrdinalEncoder function from sklearn
oe = OrdinalEncoder()
df[categorical_columns] = oe.fit_transform(df[categorical_columns])
elif input_encoding == "FrequencyEncoder":
# Frequency encode
for variable in categorical_columns:
# group by frequency
fq = df.groupby(variable).size()/len(df)
# mapping values to dataframe
df.loc[:, "{}".format(variable)] = df[variable].map(fq)
In this section you will:
Split the X and y dataset
# Split the y variable series and x variables dataset
X = df.drop([input_target_class],axis=1)
y = df[input_target_class]
It is a technique to standardize the x variables (features) present in the data in a fixed range. It needs to be done before training the model.
But if you are using tree based models, you should not go for feature scaling
# Define the function to scale the data using StandardScaler()
def scale_data(data):
scaler = StandardScaler()
# transform data
scaled_data = scaler.fit_transform(data)
scaled_data = DataFrame(scaled_data)
scaled_data.columns = data.columns
return scaled_data
# Scale X dataset
scaled_X = scale_data(X)
scaled_X.head()
| age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | -0.210031 | 0.254214 | 0.421486 | 0.076249 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.320122 | -0.419451 | -0.319668 | 0.040104 | -0.062903 | 1.053226 | 0.603224 | -0.197314 | 0.550044 | 1.311903 | 1.385535 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
| 1 | -2.627234 | -1.972476 | 0.421486 | 2.363728 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.320122 | -0.784315 | -0.390819 | 0.040104 | -0.062903 | -0.457965 | -0.132789 | -0.909782 | 0.074073 | -0.762252 | -0.721743 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
| 2 | 0.615355 | 0.254214 | -1.421074 | 0.838742 | 2.507853 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | 3.697618 | -0.074858 | -0.212942 | 0.040104 | -0.062903 | -1.084556 | -0.991470 | -0.316059 | 0.074073 | -0.762252 | 1.385535 | -0.304789 | 1.969276 | -0.484322 | 2.380476 |
| 3 | -0.210031 | -0.488016 | -2.342354 | 2.363728 | -0.380269 | 0.36489 | -2.064742 | 2.919556 | -0.241249 | -0.373337 | -0.014047 | 0.142813 | -2.896333 | -0.737181 | -0.494823 | -0.868801 | -0.632711 | -0.996862 | 1.311903 | -0.721743 | -0.304789 | 1.969276 | 2.064742 | 2.380476 |
| 4 | -0.033163 | 0.254214 | -1.421074 | 0.838742 | -0.380269 | 0.36489 | 0.484322 | -0.342518 | -0.241249 | -0.519679 | -0.622154 | -0.284093 | 0.040104 | -0.062903 | -0.347390 | -0.500795 | -0.395222 | -0.163913 | -0.762252 | -0.721743 | -0.304789 | -0.507801 | -0.484322 | -0.420084 |
Split the dataset in training and test set
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size = 0.3, random_state = 0)
Train the model on training data
# Spot-Check Algorithms
models = []
models.append(( 'LR' , LogisticRegression()))
models.append(( 'LDA' , LinearDiscriminantAnalysis()))
models.append(( 'KNN' , KNeighborsClassifier()))
models.append(( 'CART' , DecisionTreeClassifier()))
models.append(('RFC', RandomForestClassifier()))
models.append(('XGB', XGBClassifier()))
models.append(( 'NB' , GaussianNB()))
models.append(('LGB', LGBMClassifier()))
# Training the model:
score_results = []
score_names = []
num_folds = 10
scorings = ['accuracy', 'roc_auc']
for name, model in models:
kfold = KFold(n_splits=num_folds)
for scoring in scorings:
cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring=scoring)
msg = "%s,%s: %f (%f)" % (name, scoring, cv_results.mean(), cv_results.std())
print(msg)
score_names.append(name)
score_results.append(cv_results)
LR,accuracy: 1.000000 (0.000000)
LR,roc_auc: 1.000000 (0.000000)
LDA,accuracy: 0.953571 (0.027894)
LDA,roc_auc: 0.994787 (0.008848)
KNN,accuracy: 0.964286 (0.027664)
KNN,roc_auc: 0.999745 (0.000765)
CART,accuracy: 0.967857 (0.029667)
CART,roc_auc: 0.960722 (0.038872)
RFC,accuracy: 0.989286 (0.022868)
RFC,roc_auc: 0.999490 (0.001531)
[21:01:47] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:48] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:48] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:49] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:49] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:49] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:50] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:50] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:50] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:51] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. XGB,accuracy: 0.975000 (0.022868)
[21:01:51] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:51] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:51] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:52] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:52] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:52] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:53] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:53] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:53] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:01:54] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. XGB,roc_auc: 0.995695 (0.007977)
NB,accuracy: 0.960714 (0.037287)
NB,roc_auc: 0.967389 (0.032010)
LGB,accuracy: 0.978571 (0.017496)
LGB,roc_auc: 0.998905 (0.002196)
Get the predictions from the model on testing data
Get the evaluation metrics to evaluate the performance of model on testing data
# Define a function to compute various evaluation metrics
def compute_evaluation_metric(model, x_test, y_actual, y_predicted, y_predicted_prob):
print("\n Accuracy Score : \n ",accuracy_score(y_actual,y_predicted))
print("\n AUC Score : \n", roc_auc_score(y_actual, y_predicted_prob))
print("\n Confusion Matrix : \n ",confusion_matrix(y_actual, y_predicted))
print("\n Classification Report : \n",classification_report(y_actual, y_predicted))
print("\n ROC curve : \n")
sns.set_style("white")
plot_roc_curve(model, x_test, y_actual)
plt.show()
# Training the model and evaluate, prediction:
for name, model in models:
model.fit(X_train, y_train)
# Predict class for test dataset
y_pred = model.predict(X_test)
# Predict probability for test dataset
y_pred_prod = model.predict_proba(X_test)
y_pred_prod = [x[1] for x in y_pred_prod]
print(name,"Y predicted : ",y_pred)
print(name,"Y probability predicted : ",y_pred_prod[:5])
compute_evaluation_metric(model, X_test, y_test, y_pred, y_pred_prod)
LR Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 1 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 1 0 1 0 0 1]
LR Y probability predicted : [1.8037052521016162e-07, 0.9895297982806406, 0.9711635825097306, 2.302874417977372e-12, 1.5407973449278866e-09]
Accuracy Score :
0.9916666666666667
AUC Score :
1.0
Confusion Matrix :
[[71 1]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 72
1 0.98 1.00 0.99 48
accuracy 0.99 120
macro avg 0.99 0.99 0.99 120
weighted avg 0.99 0.99 0.99 120
ROC curve :
LDA Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 1 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
1 1 0 1 0 1 0 0 1]
LDA Y probability predicted : [0.005011724108802211, 0.9952069438611767, 0.9931432694740396, 7.070583461710933e-08, 0.00035047802093480124]
Accuracy Score :
0.9833333333333333
AUC Score :
0.9997106481481481
Confusion Matrix :
[[70 2]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 0.97 0.99 72
1 0.96 1.00 0.98 48
accuracy 0.98 120
macro avg 0.98 0.99 0.98 120
weighted avg 0.98 0.98 0.98 120
ROC curve :
KNN Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 1 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
1 1 0 1 0 1 0 0 1]
KNN Y probability predicted : [0.0, 1.0, 1.0, 0.0, 0.0]
Accuracy Score :
0.9833333333333333
AUC Score :
0.9930555555555556
Confusion Matrix :
[[70 2]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 0.97 0.99 72
1 0.96 1.00 0.98 48
accuracy 0.98 120
macro avg 0.98 0.99 0.98 120
weighted avg 0.98 0.98 0.98 120
ROC curve :
CART Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 0 0 1 0 0 1]
CART Y probability predicted : [0.0, 1.0, 1.0, 0.0, 0.0]
Accuracy Score :
0.9916666666666667
AUC Score :
0.9895833333333333
Confusion Matrix :
[[72 0]
[ 1 47]]
Classification Report :
precision recall f1-score support
0 0.99 1.00 0.99 72
1 1.00 0.98 0.99 48
accuracy 0.99 120
macro avg 0.99 0.99 0.99 120
weighted avg 0.99 0.99 0.99 120
ROC curve :
RFC Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 1 0 1 0 0 1]
RFC Y probability predicted : [0.0, 0.88, 0.96, 0.0, 0.0]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[72 0]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 72
1 1.00 1.00 1.00 48
accuracy 1.00 120
macro avg 1.00 1.00 1.00 120
weighted avg 1.00 1.00 1.00 120
ROC curve :
[21:02:29] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
XGB Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 1 0 1 0 0 1]
XGB Y probability predicted : [0.0022306948, 0.98368555, 0.9958354, 0.0006578047, 0.0016758419]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[72 0]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 72
1 1.00 1.00 1.00 48
accuracy 1.00 120
macro avg 1.00 1.00 1.00 120
weighted avg 1.00 1.00 1.00 120
ROC curve :
NB Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 1 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 1 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 1 1 1 0 0 1]
NB Y probability predicted : [0.0, 1.0, 1.0, 0.0, 0.0]
Accuracy Score :
0.975
AUC Score :
0.9791666666666667
Confusion Matrix :
[[69 3]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 72
1 0.94 1.00 0.97 48
accuracy 0.97 120
macro avg 0.97 0.98 0.97 120
weighted avg 0.98 0.97 0.98 120
ROC curve :
LGB Y predicted : [0 1 1 0 0 0 0 1 0 0 0 0 0 1 0 0 1 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0
0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 0 1 0 1 0 0 0 1 0 1 0 0
0 0 1 1 1 1 1 0 0 0 1 0 0 1 1 1 1 1 0 0 0 0 0 1 1 1 1 0 1 0 1 1 0 1 0 1 0
0 1 0 1 0 1 0 0 1]
LGB Y probability predicted : [9.707500141044328e-05, 0.9992791342912986, 0.9999327740946744, 1.6024364844630664e-05, 1.5872290897321578e-05]
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[72 0]
[ 0 48]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 72
1 1.00 1.00 1.00 48
accuracy 1.00 120
macro avg 1.00 1.00 1.00 120
weighted avg 1.00 1.00 1.00 120
ROC curve :
#alg comp using accuracy results
acc_results = []
acc_names = []
for name, model in models:
kfold = KFold(n_splits=num_folds)
cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring='accuracy')
acc_results.append(cv_results)
acc_names.append(name)
msg = "%s,accuracy: %f (%f)" % (name, cv_results.mean(), cv_results.std())
print(msg)
fig = plt.figure()
fig.suptitle('Algorithm Comparison with accuracy')
ax = fig.add_subplot(111)
plt.boxplot(acc_results)
ax.set_xticklabels(acc_names)
plt.show()
LR,accuracy: 1.000000 (0.000000)
LDA,accuracy: 0.953571 (0.027894)
KNN,accuracy: 0.964286 (0.027664)
CART,accuracy: 0.964286 (0.035714)
RFC,accuracy: 0.985714 (0.023690)
[21:02:55] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:55] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:55] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:56] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [21:02:57] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. XGB,accuracy: 0.975000 (0.022868)
NB,accuracy: 0.960714 (0.037287)
LGB,accuracy: 0.978571 (0.017496)
#alg comp using roc results
roc_results = []
roc_names = []
for name, model in models:
kfold = KFold(n_splits=num_folds)
cv_results = cross_val_score(model, X_train, y_train, cv=kfold, scoring='roc_auc')
roc_results.append(cv_results)
roc_names.append(name)
msg = "%s,roc: %f (%f)" % (name, cv_results.mean(), cv_results.std())
print(msg)
fig = plt.figure()
fig.suptitle('Algorithm Comparison with roc')
ax = fig.add_subplot(111)
plt.boxplot(roc_results)
ax.set_xticklabels(roc_names)
plt.show()
LR,roc: 1.000000 (0.000000)
LDA,roc: 0.994787 (0.008848)
KNN,roc: 0.999745 (0.000765)
CART,roc: 0.959598 (0.033796)
RFC,roc: 0.999490 (0.001531)
[22:31:41] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:41] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:42] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:42] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:42] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:42] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:42] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:43] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:43] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. [22:31:43] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior. XGB,roc: 0.995695 (0.007977)
NB,roc: 0.967389 (0.032010)
LGB,roc: 0.998905 (0.002196)
# RFC feature importance
model_rfc = RandomForestClassifier()
# fit the model
model_rfc.fit(X, y)
# get importance
importance = model_rfc.feature_importances_
# summarize feature importance
for i,v in enumerate(importance):
print( 'RFC Feature: %s, Score: %.5f' % (df.columns[i],v))
# plot feature importance
plt.bar([df.columns[x] for x in range(len(importance))], importance)
plt.xticks(df.columns[:-1], rotation='vertical')
plt.show()
RFC Feature: age, Score: 0.00881 RFC Feature: blood_pressure, Score: 0.01147 RFC Feature: specific_gravity, Score: 0.08562 RFC Feature: albumin, Score: 0.06241 RFC Feature: sugar, Score: 0.00588 RFC Feature: red_blood_cells, Score: 0.00402 RFC Feature: pus_cell, Score: 0.00325 RFC Feature: pus_cell_clumps, Score: 0.00010 RFC Feature: bacteria, Score: 0.00068 RFC Feature: blood_glucose_random, Score: 0.03920 RFC Feature: blood_urea, Score: 0.03744 RFC Feature: serum_creatinine, Score: 0.17553 RFC Feature: sodium, Score: 0.03342 RFC Feature: potassium, Score: 0.00877 RFC Feature: haemoglobin, Score: 0.19096 RFC Feature: packed_cell_volume, Score: 0.13670 RFC Feature: white_blood_cell_count, Score: 0.00324 RFC Feature: red_blood_cell_count, Score: 0.08443 RFC Feature: ypertension, Score: 0.04251 RFC Feature: diabetes_mellitus, Score: 0.04873 RFC Feature: coronary_artery_disease, Score: 0.00000 RFC Feature: appetite, Score: 0.00597 RFC Feature: pedal_edema, Score: 0.00727 RFC Feature: anemia, Score: 0.00358
# Split the dataset into the training set and test set after cat col encoder
no_scale_X_train, no_scale_X_test, no_scale_y_train, no_scale_y_test = train_test_split(X, y, test_size = 0.3, random_state = 0)
# feature selection
def select_features(no_scale_X_train, no_scale_y_train):
fs = SelectKBest(score_func=chi2, k= 20)
ordered_feature=fs.fit(no_scale_X_train, no_scale_y_train)
return ordered_feature
ordered_feature=select_features(no_scale_X_train, no_scale_y_train)
no_scale_X_train
| age | blood_pressure | specific_gravity | albumin | sugar | red_blood_cells | pus_cell | pus_cell_clumps | bacteria | blood_glucose_random | blood_urea | serum_creatinine | sodium | potassium | haemoglobin | packed_cell_volume | white_blood_cell_count | red_blood_cell_count | ypertension | diabetes_mellitus | coronary_artery_disease | appetite | pedal_edema | anemia | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 92 | 71.0 | 70.0 | 1.010 | 3.0 | 0.0 | 1 | 0 | 1 | 1 | 219.0 | 82.0 | 3.60 | 133.0 | 4.4 | 10.40 | 33.0 | 5600.0 | 3.6 | 1 | 1 | 1 | 0 | 0 | 0 |
| 223 | 71.0 | 90.0 | 1.010 | 0.0 | 3.0 | 1 | 1 | 0 | 0 | 303.0 | 30.0 | 1.30 | 136.0 | 4.1 | 13.00 | 38.0 | 9200.0 | 4.6 | 1 | 1 | 0 | 0 | 0 | 0 |
| 234 | 37.0 | 100.0 | 1.010 | 0.0 | 0.0 | 0 | 1 | 0 | 0 | 121.0 | 19.0 | 1.30 | 138.0 | 4.4 | 15.00 | 44.0 | 4100.0 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 |
| 232 | 50.0 | 90.0 | 1.015 | 1.0 | 0.0 | 0 | 0 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 377 | 64.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 97.0 | 27.0 | 0.70 | 145.0 | 4.8 | 13.80 | 49.0 | 6400.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 142 | 72.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 84.0 | 145.0 | 7.10 | 135.0 | 5.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 22 | 48.0 | 80.0 | 1.025 | 4.0 | 0.0 | 1 | 0 | 0 | 0 | 95.0 | 163.0 | 7.70 | 136.0 | 3.8 | 9.80 | 32.0 | 6900.0 | 3.4 | 1 | 0 | 0 | 0 | 0 | 1 |
| 252 | 45.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 82.0 | 49.0 | 0.60 | 147.0 | 4.4 | 15.90 | 46.0 | 9100.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 350 | 65.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 85.0 | 20.0 | 1.00 | 142.0 | 4.8 | 16.10 | 43.0 | 9600.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 168 | 65.0 | 70.0 | 1.015 | 4.0 | 4.0 | 1 | 1 | 1 | 0 | 307.0 | 28.0 | 1.50 | 138.0 | 4.4 | 11.00 | 39.0 | 6700.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 150 | 8.0 | 60.0 | 1.025 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 27.0 | 0.90 | 138.0 | 4.4 | 12.30 | 41.0 | 6700.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 0 |
| 393 | 43.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 45.0 | 0.70 | 141.0 | 4.4 | 13.00 | 54.0 | 7400.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 66 | 67.0 | 70.0 | 1.020 | 2.0 | 0.0 | 0 | 1 | 0 | 0 | 150.0 | 55.0 | 1.60 | 131.0 | 4.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 0 |
| 240 | 65.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 203.0 | 46.0 | 1.40 | 138.0 | 4.4 | 11.40 | 36.0 | 5000.0 | 4.1 | 1 | 1 | 0 | 1 | 1 | 0 |
| 218 | 33.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 92.0 | 19.0 | 0.80 | 138.0 | 4.4 | 11.80 | 34.0 | 7000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 101 | 71.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 1 | 1 | 88.0 | 80.0 | 4.40 | 139.0 | 5.7 | 11.30 | 33.0 | 10700.0 | 3.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 311 | 56.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 18.0 | 1.10 | 147.0 | 4.7 | 13.70 | 45.0 | 7500.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 194 | 80.0 | 70.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 0 | 0 | 121.0 | 49.0 | 1.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 326 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 25.0 | 1.10 | 141.0 | 4.7 | 15.80 | 41.0 | 8300.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 17 | 47.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 87.0 | 5.20 | 139.0 | 3.7 | 12.10 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 0 | 0 |
| 164 | 14.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 192.0 | 15.0 | 0.80 | 137.0 | 4.2 | 14.30 | 40.0 | 9500.0 | 5.4 | 0 | 1 | 0 | 1 | 1 | 0 |
| 186 | 8.0 | 50.0 | 1.020 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 46.0 | 1.00 | 135.0 | 3.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 30 | 55.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 155.0 | 7.30 | 132.0 | 4.9 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 114 | 12.0 | 60.0 | 1.015 | 3.0 | 0.0 | 0 | 0 | 1 | 0 | 121.0 | 51.0 | 1.80 | 138.0 | 4.4 | 12.10 | 40.0 | 10300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 263 | 45.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 46.0 | 1.20 | 137.0 | 5.0 | 16.20 | 45.0 | 8600.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 103 | 76.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 226.0 | 217.0 | 10.20 | 138.0 | 4.4 | 10.20 | 36.0 | 12700.0 | 4.2 | 1 | 0 | 0 | 1 | 1 | 1 |
| 358 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 22.0 | 1.20 | 138.0 | 3.5 | 13.00 | 45.0 | 5200.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 245 | 48.0 | 100.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 103.0 | 79.0 | 5.30 | 135.0 | 6.3 | 6.30 | 19.0 | 7200.0 | 2.6 | 1 | 0 | 1 | 1 | 0 | 0 |
| 235 | 45.0 | 70.0 | 1.010 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 113.0 | 93.0 | 2.30 | 138.0 | 4.4 | 7.90 | 26.0 | 5700.0 | 4.8 | 0 | 0 | 1 | 0 | 0 | 1 |
| 116 | 55.0 | 70.0 | 1.015 | 4.0 | 0.0 | 0 | 1 | 0 | 0 | 104.0 | 16.0 | 0.50 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 330 | 43.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 32.0 | 1.10 | 135.0 | 3.9 | 12.65 | 42.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 120 | 72.0 | 90.0 | 1.025 | 1.0 | 3.0 | 1 | 1 | 0 | 0 | 323.0 | 40.0 | 2.20 | 137.0 | 5.3 | 12.60 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 1 | 1 | 0 | 0 |
| 289 | 42.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 32.0 | 0.90 | 143.0 | 4.7 | 16.60 | 43.0 | 7100.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 112 | 55.0 | 60.0 | 1.015 | 3.0 | 0.0 | 0 | 0 | 0 | 0 | 121.0 | 34.0 | 1.20 | 138.0 | 4.4 | 10.80 | 33.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 215 | 2.0 | 80.0 | 1.010 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 136 | 46.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 213.0 | 68.0 | 2.80 | 146.0 | 6.3 | 9.30 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 275 | 52.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 125.0 | 22.0 | 1.20 | 139.0 | 4.6 | 16.50 | 43.0 | 4700.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 126 | 70.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 144.0 | 125.0 | 4.00 | 136.0 | 4.6 | 12.00 | 37.0 | 8200.0 | 4.5 | 1 | 1 | 0 | 1 | 1 | 0 |
| 198 | 59.0 | 100.0 | 1.020 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 252.0 | 40.0 | 3.20 | 137.0 | 4.7 | 11.20 | 30.0 | 26400.0 | 3.9 | 1 | 1 | 0 | 1 | 1 | 0 |
| 299 | 73.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 127.0 | 48.0 | 0.50 | 150.0 | 3.5 | 15.10 | 52.0 | 11000.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 281 | 55.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 50.0 | 1.20 | 147.0 | 5.0 | 15.50 | 41.0 | 9100.0 | 6.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 133 | 70.0 | 100.0 | 1.015 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 118.0 | 125.0 | 5.30 | 136.0 | 4.9 | 12.00 | 37.0 | 8400.0 | 8.0 | 1 | 0 | 0 | 0 | 0 | 0 |
| 33 | 60.0 | 100.0 | 1.020 | 2.0 | 0.0 | 0 | 0 | 0 | 0 | 140.0 | 55.0 | 2.50 | 138.0 | 4.4 | 10.10 | 29.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 0 | 0 |
| 378 | 71.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 0.90 | 140.0 | 4.8 | 15.20 | 42.0 | 7700.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 162 | 59.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 204.0 | 34.0 | 1.50 | 124.0 | 4.1 | 9.80 | 37.0 | 6000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 34 | 70.0 | 70.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 1 | 1 | 171.0 | 153.0 | 5.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 1 | 0 | 0 |
| 231 | 60.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 269.0 | 51.0 | 2.80 | 138.0 | 3.7 | 11.50 | 35.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 0 | 1 | 0 |
| 97 | 65.0 | 60.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 51.0 | 2.20 | 132.0 | 3.8 | 10.00 | 32.0 | 9100.0 | 4.0 | 1 | 1 | 0 | 1 | 1 | 0 |
| 85 | 70.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 46.0 | 1.50 | 138.0 | 4.4 | 9.90 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 1 | 1 | 0 |
| 61 | 67.0 | 80.0 | 1.010 | 1.0 | 3.0 | 1 | 0 | 0 | 0 | 182.0 | 391.0 | 32.00 | 163.0 | 39.0 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 167 | 34.0 | 70.0 | 1.020 | 0.0 | 0.0 | 0 | 1 | 0 | 0 | 139.0 | 19.0 | 0.90 | 138.0 | 4.4 | 12.70 | 42.0 | 2200.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 282 | 20.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 123.0 | 44.0 | 1.00 | 135.0 | 3.8 | 14.60 | 44.0 | 5500.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 200 | 90.0 | 90.0 | 1.025 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 139.0 | 89.0 | 3.00 | 140.0 | 4.1 | 12.00 | 37.0 | 7900.0 | 3.9 | 1 | 1 | 0 | 0 | 0 | 0 |
| 391 | 36.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 85.0 | 16.0 | 1.10 | 142.0 | 4.1 | 15.60 | 44.0 | 5800.0 | 6.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 230 | 65.0 | 60.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 192.0 | 17.0 | 1.70 | 130.0 | 4.3 | 12.65 | 40.0 | 9500.0 | 4.8 | 1 | 1 | 0 | 1 | 0 | 0 |
| 287 | 39.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 22.0 | 0.60 | 137.0 | 3.8 | 13.40 | 43.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 108 | 45.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 0 | 0 | 0 | 107.0 | 15.0 | 1.00 | 141.0 | 4.2 | 11.80 | 37.0 | 10200.0 | 4.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 46 | 48.0 | 70.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 24.0 | 1.20 | 142.0 | 4.2 | 12.40 | 37.0 | 6400.0 | 4.7 | 0 | 1 | 0 | 0 | 0 | 0 |
| 320 | 57.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 105.0 | 49.0 | 1.20 | 150.0 | 4.7 | 15.70 | 44.0 | 10400.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 396 | 42.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 75.0 | 31.0 | 1.20 | 141.0 | 3.5 | 16.50 | 54.0 | 7800.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 224 | 34.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 117.0 | 28.0 | 2.20 | 138.0 | 3.8 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 73 | 55.0 | 100.0 | 1.015 | 2.0 | 0.0 | 0 | 0 | 0 | 0 | 129.0 | 107.0 | 6.70 | 132.0 | 4.4 | 4.80 | 14.0 | 6300.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 1 |
| 137 | 45.0 | 60.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 268.0 | 86.0 | 4.00 | 134.0 | 5.1 | 10.00 | 29.0 | 9200.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 381 | 71.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 79.0 | 47.0 | 0.50 | 142.0 | 4.8 | 16.60 | 40.0 | 5800.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 220 | 36.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 103.0 | 42.0 | 1.30 | 138.0 | 4.4 | 11.90 | 36.0 | 8800.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 210 | 59.0 | 100.0 | 1.015 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 255.0 | 132.0 | 12.80 | 135.0 | 5.7 | 7.30 | 20.0 | 9800.0 | 3.9 | 1 | 1 | 1 | 0 | 0 | 1 |
| 29 | 68.0 | 70.0 | 1.005 | 1.0 | 0.0 | 0 | 0 | 1 | 0 | 121.0 | 28.0 | 1.40 | 138.0 | 4.4 | 12.90 | 38.0 | 8000.0 | 4.8 | 0 | 0 | 1 | 0 | 0 | 0 |
| 181 | 45.0 | 70.0 | 1.025 | 2.0 | 0.0 | 1 | 0 | 1 | 0 | 117.0 | 52.0 | 2.20 | 136.0 | 3.8 | 10.00 | 30.0 | 19100.0 | 3.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 360 | 35.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 105.0 | 39.0 | 0.50 | 135.0 | 3.9 | 14.70 | 43.0 | 5800.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 271 | 30.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 96.0 | 25.0 | 0.50 | 144.0 | 4.8 | 13.80 | 42.0 | 9000.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 51 | 54.0 | 100.0 | 1.015 | 3.0 | 0.0 | 1 | 1 | 1 | 0 | 162.0 | 66.0 | 1.60 | 136.0 | 4.4 | 10.30 | 33.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 328 | 28.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 29.0 | 0.60 | 145.0 | 4.9 | 12.65 | 45.0 | 8600.0 | 6.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 352 | 37.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 47.0 | 1.10 | 141.0 | 4.9 | 15.00 | 48.0 | 7000.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 27 | 69.0 | 70.0 | 1.010 | 3.0 | 4.0 | 1 | 0 | 0 | 0 | 264.0 | 87.0 | 2.70 | 130.0 | 4.0 | 12.50 | 37.0 | 9600.0 | 4.1 | 1 | 1 | 1 | 0 | 1 | 0 |
| 2 | 62.0 | 80.0 | 1.010 | 2.0 | 3.0 | 1 | 1 | 0 | 0 | 423.0 | 53.0 | 1.80 | 138.0 | 4.4 | 9.60 | 31.0 | 7500.0 | 4.8 | 0 | 1 | 0 | 1 | 0 | 1 |
| 217 | 63.0 | 100.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 61.0 | 1.80 | 141.0 | 4.4 | 12.20 | 36.0 | 10500.0 | 4.3 | 0 | 1 | 0 | 0 | 0 | 0 |
| 156 | 66.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 1 | 153.0 | 76.0 | 3.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 212 | 40.0 | 70.0 | 1.015 | 3.0 | 4.0 | 1 | 1 | 0 | 0 | 253.0 | 150.0 | 11.90 | 132.0 | 5.6 | 10.90 | 31.0 | 8800.0 | 3.4 | 1 | 1 | 0 | 1 | 1 | 0 |
| 376 | 58.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 88.0 | 16.0 | 1.10 | 147.0 | 3.5 | 16.40 | 53.0 | 9100.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 221 | 66.0 | 70.0 | 1.020 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 248.0 | 30.0 | 1.70 | 138.0 | 5.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 138 | 73.0 | 80.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 51.0 | 1.60 | 142.0 | 3.5 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 236 | 65.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 74.0 | 66.0 | 2.00 | 136.0 | 5.4 | 9.10 | 25.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 0 | 1 | 0 |
| 219 | 68.0 | 90.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 238.0 | 57.0 | 2.50 | 138.0 | 4.4 | 9.80 | 28.0 | 8000.0 | 3.3 | 1 | 1 | 0 | 1 | 0 | 0 |
| 274 | 19.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 107.0 | 23.0 | 0.70 | 141.0 | 4.2 | 14.40 | 44.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 278 | 48.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 112.0 | 44.0 | 1.20 | 142.0 | 4.9 | 14.50 | 44.0 | 9400.0 | 6.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 307 | 47.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 137.0 | 17.0 | 0.50 | 150.0 | 3.5 | 13.60 | 44.0 | 7900.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 239 | 34.0 | 90.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 104.0 | 50.0 | 1.60 | 137.0 | 4.1 | 11.90 | 39.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 35 | 65.0 | 90.0 | 1.020 | 2.0 | 1.0 | 0 | 1 | 0 | 0 | 270.0 | 39.0 | 2.00 | 138.0 | 4.4 | 12.00 | 36.0 | 9800.0 | 4.9 | 1 | 1 | 0 | 1 | 0 | 1 |
| 204 | 65.0 | 90.0 | 1.010 | 4.0 | 2.0 | 1 | 1 | 0 | 0 | 172.0 | 82.0 | 13.50 | 145.0 | 6.3 | 8.80 | 31.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 1 |
| 392 | 57.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 133.0 | 48.0 | 1.20 | 147.0 | 4.3 | 14.80 | 46.0 | 6600.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 67 | 45.0 | 80.0 | 1.020 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 425.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 24 | 42.0 | 100.0 | 1.015 | 4.0 | 0.0 | 1 | 0 | 0 | 1 | 121.0 | 50.0 | 1.40 | 129.0 | 4.0 | 11.10 | 39.0 | 8300.0 | 4.6 | 1 | 0 | 0 | 1 | 0 | 0 |
| 332 | 34.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 33.0 | 1.00 | 150.0 | 5.0 | 15.30 | 44.0 | 10500.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 44 | 54.0 | 80.0 | 1.010 | 3.0 | 0.0 | 0 | 0 | 0 | 0 | 207.0 | 77.0 | 6.30 | 134.0 | 4.8 | 9.70 | 28.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 241 | 57.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 0 | 0 | 0 | 165.0 | 45.0 | 1.50 | 140.0 | 3.3 | 10.40 | 31.0 | 4200.0 | 3.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 129 | 75.0 | 70.0 | 1.025 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 158.0 | 49.0 | 1.40 | 135.0 | 4.7 | 11.10 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 1 | 0 |
| 93 | 73.0 | 100.0 | 1.010 | 3.0 | 2.0 | 0 | 0 | 1 | 0 | 295.0 | 90.0 | 5.60 | 140.0 | 2.9 | 9.20 | 30.0 | 7000.0 | 3.2 | 1 | 1 | 1 | 1 | 0 | 0 |
| 111 | 65.0 | 80.0 | 1.010 | 3.0 | 3.0 | 1 | 1 | 0 | 0 | 294.0 | 71.0 | 4.40 | 128.0 | 5.4 | 10.00 | 32.0 | 9000.0 | 3.9 | 1 | 1 | 1 | 0 | 0 | 0 |
| 166 | 27.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 76.0 | 44.0 | 3.90 | 127.0 | 4.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 1 |
| 389 | 41.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 112.0 | 48.0 | 0.70 | 140.0 | 5.0 | 17.00 | 52.0 | 7200.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 383 | 80.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 119.0 | 46.0 | 0.70 | 141.0 | 4.9 | 13.90 | 49.0 | 5100.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 342 | 44.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 46.0 | 0.50 | 138.0 | 4.2 | 15.00 | 50.0 | 7700.0 | 6.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 40 | 46.0 | 90.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 0 | 0 | 99.0 | 80.0 | 2.10 | 138.0 | 4.4 | 11.10 | 32.0 | 9100.0 | 4.1 | 1 | 0 | 0 | 0 | 0 | 0 |
| 18 | 60.0 | 100.0 | 1.025 | 0.0 | 3.0 | 1 | 1 | 0 | 0 | 263.0 | 27.0 | 1.30 | 135.0 | 4.3 | 12.70 | 37.0 | 11400.0 | 4.3 | 1 | 1 | 1 | 0 | 0 | 0 |
| 284 | 33.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 37.0 | 1.20 | 142.0 | 4.0 | 16.90 | 52.0 | 6700.0 | 6.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 79 | 56.0 | 80.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 165.0 | 55.0 | 1.80 | 138.0 | 4.4 | 13.50 | 40.0 | 11800.0 | 5.0 | 1 | 1 | 0 | 1 | 1 | 0 |
| 249 | 56.0 | 90.0 | 1.010 | 4.0 | 1.0 | 1 | 0 | 1 | 0 | 176.0 | 309.0 | 13.30 | 124.0 | 6.5 | 3.10 | 9.0 | 5400.0 | 2.1 | 1 | 1 | 0 | 1 | 1 | 1 |
| 394 | 50.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 137.0 | 46.0 | 0.80 | 139.0 | 5.0 | 14.10 | 45.0 | 9500.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 71 | 46.0 | 60.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 163.0 | 92.0 | 3.30 | 141.0 | 4.0 | 9.80 | 28.0 | 14600.0 | 3.2 | 1 | 1 | 0 | 0 | 0 | 0 |
| 13 | 68.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 98.0 | 86.0 | 4.60 | 135.0 | 3.4 | 9.80 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 1 | 1 | 0 |
| 367 | 68.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 125.0 | 41.0 | 1.10 | 139.0 | 3.8 | 17.40 | 50.0 | 6700.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 213 | 55.0 | 80.0 | 1.010 | 3.0 | 1.0 | 1 | 0 | 1 | 1 | 214.0 | 73.0 | 3.90 | 137.0 | 4.9 | 10.90 | 34.0 | 7400.0 | 3.7 | 1 | 1 | 0 | 0 | 1 | 0 |
| 385 | 63.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 113.0 | 25.0 | 0.60 | 146.0 | 4.9 | 16.50 | 52.0 | 8000.0 | 5.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 388 | 51.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 94.0 | 15.0 | 1.20 | 144.0 | 3.7 | 15.50 | 46.0 | 9500.0 | 6.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 228 | 60.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 52.0 | 2.50 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 |
| 160 | 81.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 148.0 | 39.0 | 2.10 | 147.0 | 4.2 | 10.90 | 35.0 | 9400.0 | 2.4 | 1 | 1 | 1 | 1 | 1 | 0 |
| 104 | 55.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 143.0 | 88.0 | 2.00 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 161 | 62.0 | 80.0 | 1.015 | 3.0 | 0.0 | 0 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 14.30 | 42.0 | 10200.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 83 | 48.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 127.0 | 19.0 | 1.00 | 134.0 | 3.6 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 189 | 64.0 | 60.0 | 1.010 | 4.0 | 1.0 | 0 | 0 | 0 | 1 | 239.0 | 58.0 | 4.30 | 137.0 | 5.4 | 9.50 | 29.0 | 7500.0 | 3.4 | 1 | 1 | 0 | 1 | 1 | 0 |
| 397 | 12.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 26.0 | 0.60 | 137.0 | 4.4 | 15.80 | 49.0 | 6600.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 118 | 55.0 | 70.0 | 1.010 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 25.0 | 1.20 | 138.0 | 4.4 | 11.40 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 0 |
| 254 | 51.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 38.0 | 0.80 | 135.0 | 3.7 | 13.00 | 49.0 | 8300.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 188 | 8.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 80.0 | 66.0 | 2.50 | 142.0 | 3.6 | 12.20 | 38.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 208 | 67.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 341.0 | 37.0 | 1.50 | 138.0 | 4.4 | 12.30 | 41.0 | 6900.0 | 4.9 | 1 | 1 | 0 | 0 | 0 | 1 |
| 375 | 70.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 74.0 | 41.0 | 0.50 | 143.0 | 4.5 | 15.10 | 48.0 | 9700.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 110 | 63.0 | 90.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 123.0 | 19.0 | 2.00 | 142.0 | 3.8 | 11.70 | 34.0 | 11400.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 149 | 65.0 | 70.0 | 1.020 | 1.0 | 0.0 | 0 | 0 | 0 | 0 | 139.0 | 29.0 | 1.00 | 138.0 | 4.4 | 10.50 | 32.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 0 |
| 157 | 62.0 | 70.0 | 1.025 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 122.0 | 42.0 | 1.70 | 136.0 | 4.7 | 12.60 | 39.0 | 7900.0 | 3.9 | 1 | 1 | 0 | 0 | 0 | 0 |
| 152 | 39.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 20.0 | 0.80 | 133.0 | 3.5 | 10.90 | 32.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 16 | 47.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 46.0 | 2.20 | 138.0 | 4.1 | 12.60 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 269 | 25.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 19.0 | 1.20 | 142.0 | 4.9 | 15.00 | 48.0 | 6900.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 75 | 5.0 | 80.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 16.0 | 0.70 | 138.0 | 3.2 | 8.10 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 1 |
| 109 | 54.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 233.0 | 50.1 | 1.90 | 138.0 | 4.4 | 11.70 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 327 | 30.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 120.0 | 31.0 | 0.80 | 150.0 | 4.6 | 13.40 | 44.0 | 10700.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 205 | 61.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 28.0 | 2.10 | 138.0 | 4.4 | 12.60 | 43.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 315 | 44.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 13.80 | 48.0 | 7800.0 | 4.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 139 | 41.0 | 70.0 | 1.015 | 2.0 | 0.0 | 1 | 0 | 0 | 1 | 121.0 | 68.0 | 2.80 | 132.0 | 4.1 | 11.10 | 33.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 1 |
| 237 | 80.0 | 70.0 | 1.015 | 2.0 | 2.0 | 1 | 1 | 0 | 0 | 141.0 | 53.0 | 2.20 | 138.0 | 4.4 | 12.70 | 40.0 | 9600.0 | 4.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 319 | 30.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 138.0 | 15.0 | 1.10 | 135.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 248 | 59.0 | 70.0 | 1.010 | 1.0 | 3.0 | 0 | 0 | 0 | 0 | 424.0 | 55.0 | 1.70 | 138.0 | 4.5 | 12.60 | 37.0 | 10200.0 | 4.1 | 1 | 1 | 1 | 0 | 0 | 0 |
| 308 | 43.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 81.0 | 46.0 | 0.60 | 135.0 | 4.9 | 13.90 | 48.0 | 6900.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 19 | 62.0 | 60.0 | 1.015 | 1.0 | 0.0 | 1 | 0 | 1 | 0 | 100.0 | 31.0 | 1.60 | 138.0 | 4.4 | 10.30 | 30.0 | 5300.0 | 3.7 | 1 | 0 | 1 | 0 | 0 | 0 |
| 226 | 64.0 | 100.0 | 1.015 | 4.0 | 2.0 | 0 | 0 | 0 | 1 | 163.0 | 54.0 | 7.20 | 140.0 | 4.6 | 7.90 | 26.0 | 7500.0 | 3.4 | 1 | 1 | 0 | 0 | 1 | 0 |
| 306 | 52.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 128.0 | 30.0 | 1.20 | 140.0 | 4.5 | 15.20 | 52.0 | 4300.0 | 5.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 48.0 | 70.0 | 1.005 | 4.0 | 0.0 | 1 | 0 | 1 | 0 | 117.0 | 56.0 | 3.80 | 111.0 | 2.5 | 11.20 | 32.0 | 6700.0 | 3.9 | 1 | 0 | 0 | 1 | 1 | 1 |
| 276 | 20.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 137.0 | 4.7 | 14.00 | 41.0 | 4500.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 125 | 72.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 308.0 | 36.0 | 2.50 | 131.0 | 4.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 1 | 0 | 0 |
| 77 | 67.0 | 70.0 | 1.010 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 102.0 | 48.0 | 3.20 | 137.0 | 5.0 | 11.90 | 34.0 | 7100.0 | 3.7 | 1 | 1 | 0 | 0 | 1 | 0 |
| 184 | 54.0 | 60.0 | 1.015 | 3.0 | 2.0 | 1 | 0 | 0 | 0 | 352.0 | 137.0 | 3.30 | 133.0 | 4.5 | 11.30 | 31.0 | 5800.0 | 3.6 | 1 | 1 | 1 | 1 | 1 | 0 |
| 301 | 44.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 96.0 | 33.0 | 0.90 | 147.0 | 4.5 | 16.90 | 41.0 | 7200.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 379 | 62.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 78.0 | 45.0 | 0.60 | 138.0 | 3.5 | 16.10 | 50.0 | 5400.0 | 5.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 346 | 33.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 41.0 | 0.90 | 141.0 | 4.4 | 15.50 | 52.0 | 4300.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 182 | 61.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 23.0 | 0.80 | 140.0 | 4.1 | 11.30 | 35.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 356 | 34.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 87.0 | 38.0 | 0.50 | 144.0 | 4.8 | 17.10 | 47.0 | 7400.0 | 6.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 80 | 74.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 98.0 | 2.80 | 133.0 | 5.0 | 10.80 | 31.0 | 9400.0 | 3.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 258 | 42.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 98.0 | 20.0 | 0.50 | 140.0 | 3.5 | 13.90 | 44.0 | 8400.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 11 | 63.0 | 70.0 | 1.010 | 3.0 | 0.0 | 0 | 0 | 1 | 0 | 380.0 | 60.0 | 2.70 | 131.0 | 4.2 | 10.80 | 32.0 | 4500.0 | 3.8 | 1 | 1 | 0 | 1 | 1 | 0 |
| 298 | 34.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 49.0 | 1.20 | 135.0 | 4.5 | 13.50 | 48.0 | 8600.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 86 | 56.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 415.0 | 37.0 | 1.90 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 266 | 55.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 133.0 | 17.0 | 1.20 | 135.0 | 4.8 | 13.20 | 41.0 | 6800.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 36 | 76.0 | 70.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 92.0 | 29.0 | 1.80 | 133.0 | 3.9 | 10.30 | 32.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 |
| 382 | 48.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 75.0 | 22.0 | 0.80 | 137.0 | 5.0 | 16.80 | 51.0 | 6000.0 | 6.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 58 | 73.0 | 80.0 | 1.020 | 2.0 | 0.0 | 0 | 0 | 0 | 0 | 253.0 | 142.0 | 4.60 | 138.0 | 5.8 | 10.50 | 33.0 | 7200.0 | 4.3 | 1 | 1 | 1 | 0 | 0 | 0 |
| 41 | 45.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 20.0 | 0.70 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 1 | 0 |
| 270 | 23.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 111.0 | 34.0 | 1.10 | 145.0 | 4.0 | 14.30 | 41.0 | 7200.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 50 | 53.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 114.0 | 3.25 | 142.0 | 4.3 | 8.60 | 28.0 | 11000.0 | 3.8 | 1 | 1 | 0 | 1 | 1 | 1 |
| 209 | 19.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 11.50 | 40.0 | 6900.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 317 | 58.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 102.0 | 48.0 | 1.20 | 139.0 | 4.3 | 15.00 | 40.0 | 8100.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 316 | 35.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 30.0 | 0.50 | 135.0 | 4.9 | 15.40 | 48.0 | 5000.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 331 | 59.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 39.0 | 0.70 | 147.0 | 4.7 | 13.50 | 46.0 | 6700.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 123 | 43.0 | 80.0 | 1.015 | 2.0 | 3.0 | 1 | 0 | 1 | 1 | 121.0 | 30.0 | 1.10 | 138.0 | 4.4 | 14.00 | 42.0 | 14900.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 222 | 74.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 108.0 | 68.0 | 1.80 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 62 | 15.0 | 60.0 | 1.020 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 86.0 | 15.0 | 0.60 | 138.0 | 4.0 | 11.00 | 33.0 | 7700.0 | 3.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 302 | 29.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 127.0 | 44.0 | 1.20 | 145.0 | 5.0 | 14.80 | 48.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 130 | 50.0 | 90.0 | 1.010 | 2.0 | 0.0 | 1 | 0 | 1 | 1 | 128.0 | 208.0 | 9.20 | 134.0 | 4.8 | 8.20 | 22.0 | 16300.0 | 2.7 | 0 | 0 | 0 | 1 | 1 | 1 |
| 187 | 3.0 | 80.0 | 1.010 | 2.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 22.0 | 0.70 | 138.0 | 4.4 | 10.70 | 34.0 | 12300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 23 | 21.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 1 |
| 43 | 35.0 | 80.0 | 1.010 | 1.0 | 0.0 | 0 | 1 | 0 | 0 | 79.0 | 202.0 | 10.80 | 134.0 | 3.4 | 7.90 | 24.0 | 7900.0 | 3.1 | 0 | 1 | 0 | 0 | 0 | 0 |
| 0 | 48.0 | 80.0 | 1.020 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 36.0 | 1.20 | 138.0 | 4.4 | 15.40 | 44.0 | 7800.0 | 5.2 | 1 | 1 | 0 | 0 | 0 | 0 |
| 201 | 64.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 113.0 | 94.0 | 7.30 | 137.0 | 4.3 | 7.90 | 21.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 0 | 1 | 1 |
| 339 | 25.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 88.0 | 42.0 | 0.50 | 136.0 | 3.5 | 13.30 | 48.0 | 7000.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 98 | 50.0 | 140.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 101.0 | 106.0 | 6.50 | 135.0 | 4.3 | 6.20 | 18.0 | 5800.0 | 2.3 | 1 | 1 | 0 | 1 | 0 | 1 |
| 387 | 15.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 17.0 | 0.90 | 136.0 | 3.9 | 16.70 | 50.0 | 6200.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 178 | 42.0 | 90.0 | 1.020 | 2.0 | 0.0 | 0 | 0 | 1 | 0 | 93.0 | 153.0 | 2.70 | 139.0 | 4.3 | 9.80 | 34.0 | 9800.0 | 4.8 | 0 | 0 | 0 | 1 | 1 | 1 |
| 256 | 60.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 10.0 | 0.50 | 146.0 | 5.0 | 14.50 | 41.0 | 10700.0 | 5.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 94 | 65.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 66.0 | 1.60 | 137.0 | 4.5 | 11.60 | 36.0 | 11900.0 | 3.9 | 0 | 1 | 0 | 0 | 0 | 0 |
| 369 | 75.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 107.0 | 48.0 | 0.80 | 144.0 | 3.5 | 13.60 | 46.0 | 10300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 95 | 62.0 | 90.0 | 1.015 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 94.0 | 25.0 | 1.10 | 131.0 | 3.7 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 1 | 1 |
| 351 | 29.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 83.0 | 49.0 | 0.90 | 139.0 | 3.3 | 17.50 | 40.0 | 9900.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 169 | 55.0 | 70.0 | 1.010 | 0.0 | 2.0 | 1 | 1 | 0 | 0 | 220.0 | 68.0 | 2.80 | 138.0 | 4.4 | 8.70 | 27.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 1 |
| 69 | 26.0 | 70.0 | 1.015 | 0.0 | 4.0 | 1 | 1 | 0 | 0 | 250.0 | 20.0 | 1.10 | 138.0 | 4.4 | 15.60 | 52.0 | 6900.0 | 6.0 | 0 | 1 | 0 | 0 | 0 | 0 |
| 305 | 41.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 122.0 | 25.0 | 0.80 | 138.0 | 5.0 | 17.10 | 41.0 | 9100.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 48 | 73.0 | 70.0 | 1.005 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 70.0 | 32.0 | 0.90 | 125.0 | 4.0 | 10.00 | 29.0 | 18900.0 | 3.5 | 1 | 1 | 0 | 0 | 1 | 0 |
| 207 | 50.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 230.0 | 50.0 | 2.20 | 138.0 | 4.4 | 12.00 | 41.0 | 10400.0 | 4.6 | 1 | 1 | 0 | 0 | 0 | 0 |
| 279 | 24.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 140.0 | 23.0 | 0.60 | 140.0 | 4.7 | 16.30 | 48.0 | 5800.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 227 | 57.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 120.0 | 48.0 | 1.60 | 138.0 | 4.4 | 11.30 | 36.0 | 7200.0 | 3.8 | 1 | 1 | 0 | 0 | 0 | 0 |
| 148 | 69.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 171.0 | 26.0 | 48.10 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 0 | 0 |
| 143 | 41.0 | 80.0 | 1.015 | 1.0 | 4.0 | 0 | 1 | 0 | 0 | 210.0 | 165.0 | 18.00 | 135.0 | 4.7 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 180 | 73.0 | 90.0 | 1.010 | 1.0 | 4.0 | 0 | 0 | 1 | 0 | 234.0 | 56.0 | 1.90 | 138.0 | 4.4 | 10.30 | 28.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 131 | 5.0 | 50.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 25.0 | 0.60 | 138.0 | 4.4 | 11.80 | 36.0 | 12400.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 357 | 66.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 107.0 | 16.0 | 1.10 | 140.0 | 3.6 | 13.60 | 42.0 | 11000.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 398 | 17.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 50.0 | 1.00 | 135.0 | 4.9 | 14.20 | 51.0 | 7200.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 262 | 55.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 118.0 | 18.0 | 0.90 | 135.0 | 3.6 | 15.50 | 43.0 | 7200.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 324 | 40.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 119.0 | 15.0 | 0.70 | 150.0 | 4.9 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 203 | 55.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 207.0 | 80.0 | 6.80 | 142.0 | 5.5 | 8.50 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 1 |
| 84 | 59.0 | 70.0 | 1.010 | 3.0 | 0.0 | 1 | 0 | 0 | 0 | 76.0 | 186.0 | 15.00 | 135.0 | 7.6 | 7.10 | 22.0 | 3800.0 | 2.1 | 1 | 0 | 0 | 1 | 1 | 1 |
| 121 | 54.0 | 60.0 | 1.020 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 125.0 | 21.0 | 1.30 | 137.0 | 3.4 | 15.00 | 46.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 0 |
| 345 | 22.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 97.0 | 18.0 | 1.20 | 138.0 | 4.3 | 13.50 | 42.0 | 7900.0 | 6.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 366 | 60.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 81.0 | 15.0 | 0.50 | 141.0 | 3.6 | 15.00 | 46.0 | 10500.0 | 5.3 | 0 | 0 | 0 | 0 | 0 | 0 |
| 91 | 56.0 | 70.0 | 1.015 | 4.0 | 1.0 | 0 | 1 | 0 | 0 | 210.0 | 26.0 | 1.70 | 136.0 | 3.8 | 16.10 | 52.0 | 12500.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 82 | 38.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 104.0 | 77.0 | 1.90 | 140.0 | 3.9 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 1 | 1 | 0 |
| 267 | 48.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 122.0 | 33.0 | 0.90 | 146.0 | 3.9 | 13.90 | 48.0 | 9500.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 119 | 60.0 | 70.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 140.0 | 27.0 | 1.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 291 | 47.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 124.0 | 44.0 | 1.00 | 140.0 | 4.9 | 14.90 | 41.0 | 7000.0 | 5.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 57 | 76.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 93.0 | 155.0 | 7.30 | 132.0 | 4.9 | 12.65 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 1 | 0 | 0 |
| 321 | 65.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 39.0 | 1.00 | 144.0 | 3.5 | 13.90 | 48.0 | 9600.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 257 | 38.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 91.0 | 36.0 | 0.70 | 135.0 | 3.7 | 14.00 | 46.0 | 9100.0 | 5.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 355 | 23.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 24.0 | 0.80 | 145.0 | 5.0 | 15.00 | 52.0 | 6300.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 42 | 47.0 | 100.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 204.0 | 29.0 | 1.00 | 139.0 | 4.2 | 9.70 | 33.0 | 9200.0 | 4.5 | 1 | 0 | 0 | 0 | 0 | 1 |
| 105 | 65.0 | 80.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 115.0 | 32.0 | 11.50 | 139.0 | 4.0 | 14.10 | 42.0 | 6800.0 | 5.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 368 | 30.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 82.0 | 42.0 | 0.70 | 146.0 | 5.0 | 14.90 | 45.0 | 9400.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 273 | 47.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 95.0 | 35.0 | 0.90 | 140.0 | 4.1 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 353 | 39.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 86.0 | 37.0 | 0.60 | 150.0 | 5.0 | 13.60 | 51.0 | 5800.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 38 | 69.0 | 80.0 | 1.020 | 3.0 | 0.0 | 0 | 1 | 0 | 0 | 121.0 | 103.0 | 4.10 | 132.0 | 5.9 | 12.50 | 40.0 | 8000.0 | 4.8 | 1 | 0 | 0 | 0 | 0 | 0 |
| 53 | 62.0 | 80.0 | 1.015 | 0.0 | 5.0 | 1 | 1 | 0 | 0 | 246.0 | 24.0 | 1.00 | 138.0 | 4.4 | 13.60 | 40.0 | 8500.0 | 4.7 | 1 | 1 | 0 | 0 | 0 | 0 |
| 347 | 43.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 108.0 | 25.0 | 1.00 | 144.0 | 5.0 | 17.80 | 43.0 | 7200.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 128 | 52.0 | 90.0 | 1.015 | 4.0 | 3.0 | 1 | 0 | 0 | 0 | 224.0 | 166.0 | 5.60 | 133.0 | 47.0 | 8.10 | 23.0 | 5000.0 | 2.9 | 1 | 1 | 0 | 0 | 0 | 1 |
| 290 | 54.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 76.0 | 28.0 | 0.60 | 146.0 | 3.5 | 14.80 | 52.0 | 8400.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 28 | 75.0 | 70.0 | 1.020 | 1.0 | 3.0 | 1 | 1 | 0 | 0 | 123.0 | 31.0 | 1.40 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 183 | 30.0 | 70.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 101.0 | 106.0 | 6.50 | 135.0 | 4.3 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 1 | 0 | 0 |
| 163 | 46.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 160.0 | 40.0 | 2.00 | 140.0 | 4.1 | 9.00 | 27.0 | 8100.0 | 3.2 | 1 | 0 | 0 | 1 | 0 | 1 |
| 151 | 76.0 | 90.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 172.0 | 46.0 | 1.70 | 141.0 | 5.5 | 9.60 | 30.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 0 | 1 |
| 244 | 64.0 | 90.0 | 1.015 | 3.0 | 2.0 | 1 | 0 | 1 | 0 | 463.0 | 64.0 | 2.80 | 135.0 | 4.1 | 12.20 | 40.0 | 9800.0 | 4.6 | 1 | 1 | 0 | 0 | 0 | 1 |
| 202 | 78.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 114.0 | 74.0 | 2.90 | 135.0 | 5.9 | 8.00 | 24.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 1 |
| 31 | 73.0 | 90.0 | 1.015 | 3.0 | 0.0 | 1 | 0 | 1 | 0 | 107.0 | 33.0 | 1.50 | 141.0 | 4.6 | 10.10 | 30.0 | 7800.0 | 4.0 | 0 | 0 | 0 | 1 | 0 | 0 |
| 32 | 61.0 | 90.0 | 1.010 | 1.0 | 1.0 | 1 | 1 | 0 | 0 | 159.0 | 39.0 | 1.50 | 133.0 | 4.9 | 11.30 | 34.0 | 9600.0 | 4.0 | 1 | 1 | 0 | 1 | 0 | 0 |
| 127 | 71.0 | 60.0 | 1.015 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 118.0 | 125.0 | 5.30 | 136.0 | 4.9 | 11.40 | 35.0 | 15200.0 | 4.3 | 1 | 1 | 0 | 1 | 1 | 0 |
| 185 | 4.0 | 80.0 | 1.020 | 1.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 23.0 | 0.60 | 138.0 | 4.4 | 12.00 | 34.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 372 | 72.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 109.0 | 26.0 | 0.90 | 150.0 | 4.9 | 15.00 | 52.0 | 10500.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 288 | 56.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 70.0 | 46.0 | 1.20 | 135.0 | 4.9 | 15.90 | 50.0 | 11000.0 | 5.1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 362 | 33.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 89.0 | 19.0 | 1.10 | 144.0 | 5.0 | 15.00 | 40.0 | 10300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 147 | 60.0 | 60.0 | 1.010 | 3.0 | 1.0 | 1 | 0 | 1 | 0 | 288.0 | 36.0 | 1.70 | 130.0 | 3.0 | 7.90 | 25.0 | 15200.0 | 3.0 | 1 | 0 | 0 | 1 | 0 | 1 |
| 285 | 66.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 94.0 | 19.0 | 0.70 | 135.0 | 3.9 | 16.00 | 41.0 | 5300.0 | 5.9 | 0 | 0 | 0 | 0 | 0 | 0 |
| 370 | 69.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 83.0 | 42.0 | 1.20 | 139.0 | 3.7 | 16.20 | 50.0 | 9300.0 | 5.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 177 | 65.0 | 80.0 | 1.015 | 2.0 | 1.0 | 1 | 1 | 1 | 0 | 215.0 | 133.0 | 2.50 | 138.0 | 4.4 | 13.20 | 41.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 99 | 56.0 | 180.0 | 1.020 | 0.0 | 4.0 | 1 | 0 | 0 | 0 | 298.0 | 24.0 | 1.20 | 139.0 | 3.9 | 11.20 | 32.0 | 10400.0 | 4.2 | 1 | 1 | 0 | 1 | 1 | 0 |
| 338 | 62.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 34.0 | 0.80 | 147.0 | 3.5 | 17.80 | 44.0 | 4700.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 335 | 60.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 134.0 | 45.0 | 0.50 | 139.0 | 4.8 | 14.20 | 48.0 | 10700.0 | 5.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 197 | 57.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 111.0 | 9.30 | 124.0 | 5.3 | 6.80 | 40.0 | 4300.0 | 3.0 | 1 | 1 | 0 | 0 | 0 | 1 |
| 243 | 62.0 | 90.0 | 1.020 | 2.0 | 1.0 | 1 | 1 | 0 | 0 | 169.0 | 48.0 | 2.40 | 138.0 | 2.9 | 13.40 | 47.0 | 11000.0 | 6.1 | 1 | 0 | 0 | 0 | 0 | 0 |
| 115 | 47.0 | 80.0 | 1.010 | 0.0 | 0.0 | 1 | 0 | 0 | 0 | 121.0 | 28.0 | 0.90 | 138.0 | 4.4 | 12.40 | 44.0 | 5600.0 | 4.3 | 0 | 0 | 0 | 0 | 0 | 1 |
| 265 | 50.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 97.0 | 40.0 | 0.60 | 150.0 | 4.5 | 14.20 | 48.0 | 10500.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 72 | 64.0 | 90.0 | 1.010 | 3.0 | 3.0 | 1 | 0 | 1 | 0 | 121.0 | 35.0 | 1.30 | 138.0 | 4.4 | 10.30 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 0 | 0 | 1 | 0 |
| 333 | 23.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 99.0 | 46.0 | 1.20 | 142.0 | 4.0 | 17.70 | 46.0 | 4300.0 | 5.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 25 | 61.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 108.0 | 75.0 | 1.90 | 141.0 | 5.2 | 9.90 | 29.0 | 8400.0 | 3.7 | 1 | 1 | 0 | 0 | 0 | 1 |
| 165 | 60.0 | 80.0 | 1.020 | 0.0 | 2.0 | 1 | 1 | 0 | 0 | 121.0 | 42.0 | 1.30 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 1 | 0 | 0 | 0 | 0 |
| 337 | 44.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 92.0 | 40.0 | 0.90 | 141.0 | 4.9 | 14.00 | 52.0 | 7500.0 | 6.2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 384 | 57.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 132.0 | 18.0 | 1.10 | 150.0 | 4.7 | 15.40 | 42.0 | 11000.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 174 | 54.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 111.0 | 146.0 | 7.50 | 141.0 | 4.7 | 11.00 | 35.0 | 8600.0 | 4.6 | 0 | 0 | 0 | 0 | 0 | 0 |
| 386 | 46.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 100.0 | 47.0 | 0.50 | 142.0 | 3.5 | 16.40 | 43.0 | 5700.0 | 6.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 39 | 82.0 | 80.0 | 1.010 | 2.0 | 2.0 | 1 | 1 | 0 | 0 | 140.0 | 70.0 | 3.40 | 136.0 | 4.2 | 13.00 | 40.0 | 9800.0 | 4.2 | 1 | 1 | 0 | 0 | 0 | 0 |
| 193 | 32.0 | 90.0 | 1.025 | 1.0 | 0.0 | 0 | 0 | 0 | 0 | 121.0 | 223.0 | 18.10 | 113.0 | 6.5 | 5.50 | 15.0 | 2600.0 | 2.8 | 1 | 1 | 0 | 1 | 1 | 1 |
| 314 | 39.0 | 70.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 131.0 | 46.0 | 0.60 | 145.0 | 5.0 | 15.60 | 41.0 | 9400.0 | 4.7 | 0 | 0 | 0 | 0 | 0 | 0 |
| 88 | 58.0 | 110.0 | 1.010 | 4.0 | 0.0 | 1 | 1 | 0 | 0 | 251.0 | 52.0 | 2.20 | 138.0 | 4.4 | 12.65 | 40.0 | 13200.0 | 4.7 | 1 | 1 | 0 | 0 | 0 | 0 |
| 70 | 61.0 | 80.0 | 1.015 | 0.0 | 4.0 | 1 | 1 | 0 | 0 | 360.0 | 19.0 | 0.70 | 137.0 | 4.4 | 15.20 | 44.0 | 8300.0 | 5.2 | 1 | 1 | 0 | 0 | 0 | 0 |
| 87 | 70.0 | 100.0 | 1.005 | 1.0 | 0.0 | 1 | 0 | 1 | 0 | 169.0 | 47.0 | 2.90 | 138.0 | 4.4 | 11.10 | 32.0 | 5800.0 | 5.0 | 1 | 1 | 0 | 1 | 0 | 0 |
| 292 | 30.0 | 80.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 89.0 | 42.0 | 0.50 | 139.0 | 5.0 | 16.70 | 52.0 | 10200.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 242 | 69.0 | 70.0 | 1.010 | 4.0 | 3.0 | 1 | 0 | 1 | 1 | 214.0 | 96.0 | 6.30 | 120.0 | 3.9 | 9.40 | 28.0 | 11500.0 | 3.3 | 1 | 1 | 1 | 0 | 1 | 1 |
| 277 | 46.0 | 60.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 123.0 | 46.0 | 1.00 | 135.0 | 5.0 | 15.70 | 50.0 | 6300.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 211 | 54.0 | 120.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 103.0 | 18.0 | 1.20 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 9 | 53.0 | 90.0 | 1.020 | 2.0 | 0.0 | 0 | 0 | 1 | 0 | 70.0 | 107.0 | 7.20 | 114.0 | 3.7 | 9.50 | 29.0 | 12100.0 | 3.7 | 1 | 1 | 0 | 1 | 0 | 1 |
| 359 | 74.0 | 60.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 88.0 | 50.0 | 0.60 | 147.0 | 3.7 | 17.20 | 53.0 | 6000.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 195 | 70.0 | 90.0 | 1.020 | 2.0 | 1.0 | 0 | 0 | 0 | 1 | 184.0 | 98.6 | 3.30 | 138.0 | 3.9 | 5.80 | 40.0 | 8000.0 | 4.8 | 1 | 1 | 1 | 1 | 0 | 0 |
| 251 | 23.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 70.0 | 36.0 | 1.00 | 150.0 | 4.6 | 17.00 | 52.0 | 9800.0 | 5.0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 323 | 43.0 | 80.0 | 1.025 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 30.0 | 1.10 | 143.0 | 5.0 | 15.90 | 45.0 | 7800.0 | 4.5 | 0 | 0 | 0 | 0 | 0 | 0 |
| 192 | 46.0 | 110.0 | 1.015 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 130.0 | 16.0 | 0.90 | 138.0 | 4.4 | 12.65 | 40.0 | 8000.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 117 | 55.0 | 70.0 | 1.020 | 0.0 | 0.0 | 1 | 1 | 0 | 0 | 219.0 | 36.0 | 1.30 | 139.0 | 3.7 | 12.50 | 37.0 | 9800.0 | 4.4 | 0 | 0 | 0 | 0 | 0 | 0 |
| 47 | 11.0 | 80.0 | 1.010 | 3.0 | 0.0 | 1 | 1 | 0 | 0 | 121.0 | 17.0 | 0.80 | 138.0 | 4.4 | 15.00 | 45.0 | 8600.0 | 4.8 | 0 | 0 | 0 | 0 | 0 | 0 |
| 172 | 62.0 | 80.0 | 1.010 | 1.0 | 2.0 | 1 | 1 | 0 | 0 | 309.0 | 113.0 | 2.90 | 130.0 | 2.5 | 10.60 | 34.0 | 12800.0 | 4.9 | 0 | 0 | 0 | 0 | 0 | 0 |
# scores for the features
for i in range(len(ordered_feature.scores_)):
print( 'Feature %s: %f' % (no_scale_X_train.columns[i], ordered_feature.scores_[i])) # plot the scores
plt.bar([i for i in range(len(ordered_feature.scores_))], ordered_feature.scores_)
plt.show()
Feature age: 109.358574 Feature blood_pressure: 56.278555 Feature specific_gravity: 0.003129 Feature albumin: 139.820225 Feature sugar: 61.314607 Feature red_blood_cells: 1.920005 Feature pus_cell: 7.393656 Feature pus_cell_clumps: 16.617978 Feature bacteria: 7.449438 Feature blood_glucose_random: 1556.035234 Feature blood_urea: 1289.937495 Feature serum_creatinine: 187.996095 Feature sodium: 13.576238 Feature potassium: 3.067875 Feature haemoglobin: 74.887437 Feature packed_cell_volume: 187.622622 Feature white_blood_cell_count: 4482.392254 Feature red_blood_cell_count: 10.259505 Feature ypertension: 61.314607 Feature diabetes_mellitus: 58.449438 Feature coronary_artery_disease: 12.606742 Feature appetite: 33.808989 Feature pedal_edema: 32.089888 Feature anemia: 21.202247
The first model you make may not be a good one. You need to improve the model.
In majority of the classification problems, the target class is imbalanced. So you need to balance it in order to get best modelling results.
In this section you will:
Imbalanced classes are a common problem in machine learning classification where there are a disproportionate ratio of observations in each class.
Most machine learning algorithms work best when the number of samples in each class are about equal. This is because most algorithms are designed to maximize accuracy and reduce error.
Here, you will upsample the minority class
# Over sample the minority class
ros = RandomOverSampler()
X_ros, y_ros = ros.fit_resample(X, y)
y_ros.value_counts()
0 250 1 250 Name: class, dtype: int64
# Define the function to build model on balanced dataset
def classification_model(X, y):
scaled_X = scale_data(X)
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size = 0.3, random_state = 0)
# Training the model:
model.fit(X_train, y_train)
# Predict class for test dataset
y_pred = model.predict(X_test)
# Predict probability for test dataset
y_pred_prod = model.predict_proba(X_test)
y_pred_prod = [x[1] for x in y_pred_prod]
# Compute Evaluation Metric
compute_evaluation_metric(model, X_test, y_test, y_pred, y_pred_prod)
return model
# Build model on balanced data and get evaluation metrics
# run balanced evaluation metrics on all models
for name, model in models:
print(name)
classification_model(X_ros, y_ros)
LR
Accuracy Score :
0.98
AUC Score :
0.9998221906116643
Confusion Matrix :
[[71 3]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 74
1 0.96 1.00 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
LDA
Accuracy Score :
0.9666666666666667
AUC Score :
0.9989331436699859
Confusion Matrix :
[[69 5]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.93 0.97 74
1 0.94 1.00 0.97 76
accuracy 0.97 150
macro avg 0.97 0.97 0.97 150
weighted avg 0.97 0.97 0.97 150
ROC curve :
KNN
Accuracy Score :
0.9666666666666667
AUC Score :
0.9932432432432433
Confusion Matrix :
[[69 5]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.93 0.97 74
1 0.94 1.00 0.97 76
accuracy 0.97 150
macro avg 0.97 0.97 0.97 150
weighted avg 0.97 0.97 0.97 150
ROC curve :
CART
Accuracy Score :
0.98
AUC Score :
0.9802631578947368
Confusion Matrix :
[[74 0]
[ 3 73]]
Classification Report :
precision recall f1-score support
0 0.96 1.00 0.98 74
1 1.00 0.96 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
RFC
Accuracy Score :
1.0
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 1.00 1.00 74
1 1.00 1.00 1.00 76
accuracy 1.00 150
macro avg 1.00 1.00 1.00 150
weighted avg 1.00 1.00 1.00 150
ROC curve :
XGB
[19:03:34] WARNING: /Users/runner/miniforge3/conda-bld/xgboost-split_1643226991592/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[74 0]
[ 1 75]]
Classification Report :
precision recall f1-score support
0 0.99 1.00 0.99 74
1 1.00 0.99 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
NB
Accuracy Score :
0.98
AUC Score :
0.9797297297297297
Confusion Matrix :
[[71 3]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.96 0.98 74
1 0.96 1.00 0.98 76
accuracy 0.98 150
macro avg 0.98 0.98 0.98 150
weighted avg 0.98 0.98 0.98 150
ROC curve :
LGB
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 74
1 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
Hyperparameter is a parameter whose value is set before the learning process begins
Hyperparameter tuning refers to the automatic optimization of the hyper-parameters of a ML model
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(X_ros, y_ros, test_size = 0.3, random_state = 0)
# Define the parameters gird for decision tree
param_grid_decision_tree = {'criterion': ['gini', 'entropy'],
'max_depth': [10,15,20,30,40,50],
'min_samples_leaf' : [1,2,5]
}
# Define the parameters gird for random forest
param_grid_random_forest = {'max_depth' : [10,20,40],
'n_estimators' : [100,200,300],
'min_samples_leaf' : [1,2,5]
}
# Define the parameters gird for XGBoost
param_grid_xgb = {'min_child_weight': [1, 5, 10],
'gamma': [0, 1],
'max_depth': [5,10],
'learning_rate' : [0.05,0.1]
}
# Define the parameters gird for LGBM
param_grid_lgbm = {'n_estimator':[100,200],
'num_leaves': [256,128],
'max_depth': [5, 8, 10],
'learning_rate': [0.05, 0.1]
}
def grid_model(X, y):
scaled_X = scale_data(X)
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size = 0.3, random_state = 0)
# Run grid search for lgbm
model = LGBMClassifier()
param_grid = param_grid_lgbm
grid = GridSearchCV(model, param_grid, refit = True, verbose = 3, n_jobs = -1)
# fit the model for grid search
grid.fit(X_train, y_train)
# Predict class for test dataset
y_pred = grid.predict(X_test)
# Predict probability for test dataset
y_pred_prod = grid.predict_proba(X_test)
y_pred_prod = [x[1] for x in y_pred_prod]
print("Y predicted : ",y_pred)
print("Y probability predicted : ",y_pred_prod[:5])
# Compute Evaluation Metric
compute_evaluation_metric(grid, X_test, y_test, y_pred, y_pred_prod)
# save the model to disk
filename = 'final_model.sav'
pickle.dump(grid.best_estimator_, open(filename, 'wb'))
return grid
# Run random search for lgbm
model = LGBMClassifier()
param_rdn = param_grid_lgbm
def random_search(X, y):
scaled_X = scale_data(X)
# Split the dataset into the training set and test set
X_train, X_test, y_train, y_test = train_test_split(scaled_X, y, test_size = 0.3, random_state = 0)
# Run grid search for lgbm
model = LGBMClassifier()
param_grid = param_grid_lgbm
random_search=RandomizedSearchCV(model,param_distributions=param_rdn,n_iter=5,scoring='roc_auc',n_jobs=-1,cv=5,verbose=3)
# fit the model for random search
random_search.fit(X_train, y_train)
print(random_search.best_estimator_)
print(random_search.best_params_)
# Predict class for test dataset
y_pred=random_search.predict(X_test)
# Predict probability for test dataset
y_pred_prod = random_search.predict_proba(X_test)
y_pred_prod = [x[1] for x in y_pred_prod]
print("Y predicted : ",y_pred)
print("Y probability predicted : ",y_pred_prod[:5])
# Compute Evaluation Metric
compute_evaluation_metric(random_search, X_test, y_test, y_pred, y_pred_prod)
return random_search
random_search(X_ros, y_ros)
Fitting 5 folds for each of 5 candidates, totalling 25 fits
[LightGBM] [Warning] Unknown parameter: n_estimator
LGBMClassifier(max_depth=10, n_estimator=100, num_leaves=256)
{'num_leaves': 256, 'n_estimator': 100, 'max_depth': 10, 'learning_rate': 0.1}
Y predicted : [0 1 1 1 1 0 1 1 0 0 0 1 1 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 1
1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 0 0 0 1 0 0 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1
0 1 1 0 1 1 1 0 1 0 1 1 1 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 0 1 0 0 1 0 1
0 0 0 1 1 1 1 0 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 1 0 1 1
0 0]
Y probability predicted : [2.0412716217308192e-05, 0.9999086326890452, 0.9991837306184703, 0.9999339397931517, 0.9999720892542188]
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
0 1.00 0.99 0.99 74
1 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
RandomizedSearchCV(cv=5, estimator=LGBMClassifier(), n_iter=5, n_jobs=-1,
param_distributions={'learning_rate': [0.05, 0.1],
'max_depth': [5, 8, 10],
'n_estimator': [100, 200],
'num_leaves': [256, 128]},
scoring='roc_auc', verbose=3)
[LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=0.999 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s
grid_model(X_ros, y_ros)
Fitting 5 folds for each of 24 candidates, totalling 120 fits
[LightGBM] [Warning] Unknown parameter: n_estimator
Y predicted : ['ckd' 'notckd' 'notckd' 'notckd' 'notckd' 'ckd' 'notckd' 'notckd' 'ckd'
'ckd' 'ckd' 'notckd' 'notckd' 'notckd' 'ckd' 'notckd' 'notckd' 'notckd'
'ckd' 'ckd' 'ckd' 'notckd' 'ckd' 'ckd' 'ckd' 'notckd' 'ckd' 'notckd'
'ckd' 'ckd' 'notckd' 'ckd' 'ckd' 'ckd' 'notckd' 'notckd' 'notckd'
'notckd' 'notckd' 'ckd' 'notckd' 'ckd' 'notckd' 'ckd' 'ckd' 'ckd' 'ckd'
'notckd' 'notckd' 'ckd' 'notckd' 'ckd' 'notckd' 'ckd' 'ckd' 'ckd'
'notckd' 'ckd' 'ckd' 'notckd' 'ckd' 'notckd' 'notckd' 'notckd' 'notckd'
'ckd' 'ckd' 'notckd' 'notckd' 'ckd' 'ckd' 'ckd' 'notckd' 'notckd' 'ckd'
'notckd' 'notckd' 'ckd' 'notckd' 'notckd' 'notckd' 'ckd' 'notckd' 'ckd'
'notckd' 'notckd' 'notckd' 'ckd' 'notckd' 'ckd' 'ckd' 'notckd' 'notckd'
'ckd' 'ckd' 'notckd' 'ckd' 'notckd' 'ckd' 'ckd' 'ckd' 'notckd' 'ckd'
'notckd' 'ckd' 'notckd' 'ckd' 'ckd' 'notckd' 'ckd' 'notckd' 'ckd' 'ckd'
'ckd' 'notckd' 'notckd' 'notckd' 'notckd' 'ckd' 'ckd' 'notckd' 'ckd'
'ckd' 'notckd' 'notckd' 'ckd' 'ckd' 'notckd' 'notckd' 'notckd' 'notckd'
'ckd' 'notckd' 'ckd' 'notckd' 'notckd' 'notckd' 'notckd' 'notckd'
'notckd' 'ckd' 'ckd' 'ckd' 'ckd' 'notckd' 'ckd' 'notckd' 'notckd' 'ckd'
'ckd']
Y probability predicted : [4.09211811387175e-05, 0.999891535765397, 0.9993988184867179, 0.9998720536713491, 0.9999554325650816]
Accuracy Score :
0.9933333333333333
AUC Score :
1.0
Confusion Matrix :
[[73 1]
[ 0 76]]
Classification Report :
precision recall f1-score support
ckd 1.00 0.99 0.99 74
notckd 0.99 1.00 0.99 76
accuracy 0.99 150
macro avg 0.99 0.99 0.99 150
weighted avg 0.99 0.99 0.99 150
ROC curve :
GridSearchCV(estimator=LGBMClassifier(), n_jobs=-1,
param_grid={'learning_rate': [0.05, 0.1], 'max_depth': [5, 8, 10],
'n_estimator': [100, 200], 'num_leaves': [256, 128]},
verbose=3)
# load the model from disk
loaded_model = pickle.load(open(filename, 'rb'))
loaded_model
LGBMClassifier(max_depth=5, n_estimator=100, num_leaves=256)
[LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=128;, score=0.986 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=0.986 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=256;, score=0.986 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=128;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=5, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=10, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.05, max_depth=5, n_estimator=100, num_leaves=256;, score=0.986 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.05, max_depth=8, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 5/5] END learning_rate=0.05, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 4/5] END learning_rate=0.05, max_depth=10, n_estimator=100, num_leaves=256;, score=0.971 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=100, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 2/5] END learning_rate=0.1, max_depth=5, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=256;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=8, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 1/5] END learning_rate=0.1, max_depth=10, n_estimator=100, num_leaves=128;, score=1.000 total time= 0.0s [LightGBM] [Warning] Unknown parameter: n_estimator [CV 3/5] END learning_rate=0.1, max_depth=10, n_estimator=200, num_leaves=128;, score=1.000 total time= 0.0s